Mar 10 09:03:32 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 10 09:03:33 crc restorecon[4557]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 09:03:33 crc restorecon[4557]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc 
restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc 
restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 
09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc 
restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 09:03:33 crc 
restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 
crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 
09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:03:33 crc 
restorecon[4557]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc 
restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 09:03:33 crc restorecon[4557]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 
crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc 
restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc 
restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc 
restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 09:03:33 crc restorecon[4557]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 09:03:33 crc restorecon[4557]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 10 09:03:33 crc kubenswrapper[4883]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 09:03:33 crc kubenswrapper[4883]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 10 09:03:33 crc kubenswrapper[4883]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 09:03:33 crc kubenswrapper[4883]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 09:03:33 crc kubenswrapper[4883]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 10 09:03:33 crc kubenswrapper[4883]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.940391 4883 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944767 4883 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944788 4883 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944793 4883 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944797 4883 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944801 4883 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944805 4883 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944809 4883 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944813 4883 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944817 4883 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944822 4883 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944827 4883 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944832 4883 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944837 4883 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944842 4883 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944847 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944851 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944855 4883 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944858 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944862 4883 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944865 4883 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944870 4883 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944873 4883 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944877 4883 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944881 4883 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944884 4883 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944888 4883 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944892 4883 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944896 4883 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944900 4883 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944904 4883 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944909 4883 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944913 4883 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944916 4883 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944920 4883 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944924 4883 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944930 4883 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944934 4883 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944938 4883 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944942 4883 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944945 4883 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944948 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944952 4883 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944956 4883 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944959 4883 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944964 4883 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944967 4883 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944971 4883 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944974 4883 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944978 4883 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944981 4883 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944987 4883 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944991 4883 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.944994 4883 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945000 4883 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945005 4883 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945010 4883 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945015 4883 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945019 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945023 4883 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945026 4883 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945030 4883 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945033 4883 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945036 4883 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945040
4883 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945043 4883 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945047 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945050 4883 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945054 4883 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945057 4883 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945061 4883 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.945065 4883 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945139 4883 flags.go:64] FLAG: --address="0.0.0.0" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945148 4883 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945157 4883 flags.go:64] FLAG: --anonymous-auth="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945163 4883 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945169 4883 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945174 4883 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945183 4883 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945188 4883 flags.go:64] FLAG: 
--authorization-webhook-cache-authorized-ttl="5m0s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945193 4883 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945197 4883 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945202 4883 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945207 4883 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945211 4883 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945216 4883 flags.go:64] FLAG: --cgroup-root="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945220 4883 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945224 4883 flags.go:64] FLAG: --client-ca-file="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945228 4883 flags.go:64] FLAG: --cloud-config="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945233 4883 flags.go:64] FLAG: --cloud-provider="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945237 4883 flags.go:64] FLAG: --cluster-dns="[]" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945243 4883 flags.go:64] FLAG: --cluster-domain="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945247 4883 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945252 4883 flags.go:64] FLAG: --config-dir="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945256 4883 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945261 4883 flags.go:64] FLAG: --container-log-max-files="5" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 
09:03:33.945266 4883 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945270 4883 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945275 4883 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945279 4883 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945283 4883 flags.go:64] FLAG: --contention-profiling="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945288 4883 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945292 4883 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945296 4883 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945300 4883 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945305 4883 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945309 4883 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945313 4883 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945317 4883 flags.go:64] FLAG: --enable-load-reader="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945323 4883 flags.go:64] FLAG: --enable-server="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945328 4883 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945334 4883 flags.go:64] FLAG: --event-burst="100" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945339 4883 flags.go:64] FLAG: --event-qps="50" Mar 10 09:03:33 
crc kubenswrapper[4883]: I0310 09:03:33.945343 4883 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945348 4883 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945352 4883 flags.go:64] FLAG: --eviction-hard="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945357 4883 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945362 4883 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945366 4883 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945370 4883 flags.go:64] FLAG: --eviction-soft="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945375 4883 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945378 4883 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945383 4883 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945387 4883 flags.go:64] FLAG: --experimental-mounter-path="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945391 4883 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945395 4883 flags.go:64] FLAG: --fail-swap-on="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945399 4883 flags.go:64] FLAG: --feature-gates="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945405 4883 flags.go:64] FLAG: --file-check-frequency="20s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945410 4883 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945415 4883 flags.go:64] FLAG: 
--hairpin-mode="promiscuous-bridge" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945419 4883 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945424 4883 flags.go:64] FLAG: --healthz-port="10248" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945428 4883 flags.go:64] FLAG: --help="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945432 4883 flags.go:64] FLAG: --hostname-override="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945436 4883 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945441 4883 flags.go:64] FLAG: --http-check-frequency="20s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945445 4883 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945449 4883 flags.go:64] FLAG: --image-credential-provider-config="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945453 4883 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945458 4883 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945462 4883 flags.go:64] FLAG: --image-service-endpoint="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945487 4883 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945492 4883 flags.go:64] FLAG: --kube-api-burst="100" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945496 4883 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945500 4883 flags.go:64] FLAG: --kube-api-qps="50" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945503 4883 flags.go:64] FLAG: --kube-reserved="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945508 4883 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945512 4883 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945516 4883 flags.go:64] FLAG: --kubelet-cgroups="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945520 4883 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945524 4883 flags.go:64] FLAG: --lock-file="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945528 4883 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945546 4883 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945551 4883 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945558 4883 flags.go:64] FLAG: --log-json-split-stream="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945564 4883 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945568 4883 flags.go:64] FLAG: --log-text-split-stream="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945573 4883 flags.go:64] FLAG: --logging-format="text" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945578 4883 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945583 4883 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945588 4883 flags.go:64] FLAG: --manifest-url="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945592 4883 flags.go:64] FLAG: --manifest-url-header="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945598 4883 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945603 4883 
flags.go:64] FLAG: --max-open-files="1000000" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.945654 4883 flags.go:64] FLAG: --max-pods="110" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946660 4883 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946670 4883 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946676 4883 flags.go:64] FLAG: --memory-manager-policy="None" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946684 4883 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946691 4883 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946731 4883 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946738 4883 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946761 4883 flags.go:64] FLAG: --node-status-max-images="50" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946768 4883 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946776 4883 flags.go:64] FLAG: --oom-score-adj="-999" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946781 4883 flags.go:64] FLAG: --pod-cidr="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946785 4883 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946797 4883 flags.go:64] FLAG: --pod-manifest-path="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946802 4883 flags.go:64] FLAG: --pod-max-pids="-1" Mar 10 
09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946808 4883 flags.go:64] FLAG: --pods-per-core="0" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946816 4883 flags.go:64] FLAG: --port="10250" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946822 4883 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946827 4883 flags.go:64] FLAG: --provider-id="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946832 4883 flags.go:64] FLAG: --qos-reserved="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946839 4883 flags.go:64] FLAG: --read-only-port="10255" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.946847 4883 flags.go:64] FLAG: --register-node="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947068 4883 flags.go:64] FLAG: --register-schedulable="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947080 4883 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947094 4883 flags.go:64] FLAG: --registry-burst="10" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947099 4883 flags.go:64] FLAG: --registry-qps="5" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947104 4883 flags.go:64] FLAG: --reserved-cpus="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947108 4883 flags.go:64] FLAG: --reserved-memory="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947114 4883 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947118 4883 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947123 4883 flags.go:64] FLAG: --rotate-certificates="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947127 4883 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947131 
4883 flags.go:64] FLAG: --runonce="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947136 4883 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947140 4883 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947145 4883 flags.go:64] FLAG: --seccomp-default="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947150 4883 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947154 4883 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947159 4883 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947163 4883 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947168 4883 flags.go:64] FLAG: --storage-driver-password="root" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947173 4883 flags.go:64] FLAG: --storage-driver-secure="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947178 4883 flags.go:64] FLAG: --storage-driver-table="stats" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947183 4883 flags.go:64] FLAG: --storage-driver-user="root" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947187 4883 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947191 4883 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947196 4883 flags.go:64] FLAG: --system-cgroups="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947200 4883 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947210 4883 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 10 
09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947215 4883 flags.go:64] FLAG: --tls-cert-file="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947219 4883 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947228 4883 flags.go:64] FLAG: --tls-min-version="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947233 4883 flags.go:64] FLAG: --tls-private-key-file="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947237 4883 flags.go:64] FLAG: --topology-manager-policy="none" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947241 4883 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947246 4883 flags.go:64] FLAG: --topology-manager-scope="container" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947252 4883 flags.go:64] FLAG: --v="2" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947272 4883 flags.go:64] FLAG: --version="false" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947278 4883 flags.go:64] FLAG: --vmodule="" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947283 4883 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947288 4883 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947456 4883 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947464 4883 feature_gate.go:330] unrecognized feature gate: Example Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947468 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947489 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947493 4883 
feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947497 4883 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947503 4883 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947507 4883 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947511 4883 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947516 4883 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947522 4883 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947526 4883 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947542 4883 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947546 4883 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947550 4883 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947553 4883 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947557 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947560 4883 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 09:03:33 
crc kubenswrapper[4883]: W0310 09:03:33.947563 4883 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947567 4883 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947570 4883 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947574 4883 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947578 4883 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947583 4883 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947586 4883 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947590 4883 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947594 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947598 4883 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947601 4883 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947605 4883 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947609 4883 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947616 4883 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947621 4883 feature_gate.go:330] unrecognized feature gate: 
AdminNetworkPolicy Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947624 4883 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947628 4883 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947631 4883 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947635 4883 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947639 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947649 4883 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947653 4883 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947657 4883 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947661 4883 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947666 4883 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947670 4883 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947674 4883 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947678 4883 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947681 4883 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 09:03:33 crc 
kubenswrapper[4883]: W0310 09:03:33.947685 4883 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947689 4883 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947693 4883 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947696 4883 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947700 4883 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947705 4883 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947710 4883 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947714 4883 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947718 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947722 4883 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947726 4883 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947731 4883 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947734 4883 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947739 4883 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947743 4883 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947746 4883 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947755 4883 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947759 4883 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947762 4883 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947766 4883 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947769 4883 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947772 4883 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947776 4883 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.947779 4883 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.947792 4883 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.955338 4883 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.955374 4883 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955452 4883 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955468 4883 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955484 4883 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955490 4883 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955495 4883 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955499 4883 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955504 4883 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955510 4883 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955516 4883 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955520 4883 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955524 4883 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955529 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955542 4883 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955546 4883 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955550 4883 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955553 4883 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955557 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955561 4883 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955566 4883 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955571 4883 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955576 4883 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955580 4883 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955585 4883 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955589 4883 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955594 4883 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955599 4883 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955602 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955607 4883 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955611 4883 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955615 4883 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955620 4883 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955625 4883 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955629 4883 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955633 4883 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955639 4883 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955643 4883 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955648 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955652 4883 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955656 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955660 4883 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955664 4883 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955668 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955671 4883 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955675 4883 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955679 4883 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955682 4883 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955686 4883 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955691 4883 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955695 4883 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955699 4883 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955704 4883 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955708 4883 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955712 4883 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955717 4883 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955721 4883 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955725 4883 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955729 4883 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955732 4883 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955736 4883 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955739 4883 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955743 4883 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955746 4883 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955750 4883 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955754 4883 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955757 4883 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955761 4883 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955764 4883 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955768 4883 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955774 4883 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955778 4883 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955782 4883 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.955792 4883 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955926 4883 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955932 4883 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955937 4883 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955941 4883 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955945 4883 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955949 4883 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955953 4883 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955957 4883 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955961 4883 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955966 4883 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955970 4883 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955975 4883 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955979 4883 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955983 4883 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955987 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955990 4883 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955994 4883 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.955998 4883 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956002 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956007 4883 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956012 4883 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956016 4883 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956021 4883 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956025 4883 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956028 4883 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956033 4883 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956037 4883 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956041 4883 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956045 4883 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956049 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956053 4883 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956057 4883 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956061 4883 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956065 4883 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956075 4883 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956079 4883 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956083 4883 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956086 4883 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956090 4883 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956094 4883 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956098 4883 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956101 4883 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956105 4883 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956109 4883 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956113 4883 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956116 4883 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956120 4883 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956124 4883 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956129 4883 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956135 4883 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956140 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956144 4883 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956148 4883 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956152 4883 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956157 4883 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956162 4883 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956167 4883 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956172 4883 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956176 4883 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956181 4883 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956185 4883 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956189 4883 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956193 4883 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956197 4883 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956201 4883 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956205 4883 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956208 4883 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956212 4883 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956215 4883 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956219 4883 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 09:03:33 crc kubenswrapper[4883]: W0310 09:03:33.956228 4883 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.956234 4883 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.956395 4883 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 10 09:03:33 crc kubenswrapper[4883]: E0310 09:03:33.958935 4883 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.961567 4883 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.961643 4883 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.963095 4883 server.go:997] "Starting client certificate rotation"
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.963121 4883 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.963272 4883 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.975716 4883 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 09:03:33 crc kubenswrapper[4883]: E0310 09:03:33.977814 4883 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.26.140:6443: connect: connection refused" logger="UnhandledError"
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.978108 4883 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 09:03:33 crc kubenswrapper[4883]: I0310 09:03:33.990658 4883 log.go:25] "Validated CRI v1 runtime API"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.008839 4883 log.go:25] "Validated CRI v1 image API"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.010198 4883 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.013008 4883 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-10-09-00-18-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.013032 4883 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}]
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.026558 4883 manager.go:217] Machine: {Timestamp:2026-03-10 09:03:34.02512306 +0000 UTC m=+0.280020970 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445404 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f194c89e-85d8-4bba-8c7a-70d8bbd420b2 BootID:0ffe6628-2ca8-4f77-b1d4-26329720410f Filesystems:[{Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:c7:90:bf Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:c7:90:bf Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:88:52:cb Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:34:2a:b5 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:14:ab:93 Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:5d:58:d5 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:5e:e8:79:33:54:70 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:b6:db:3a:b6:ac:97 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.026723 4883 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.026814 4883 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.027067 4883 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.027224 4883 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.027257 4883 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.027424 4883 topology_manager.go:138] "Creating topology manager with none policy"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.027434 4883 container_manager_linux.go:303] "Creating device plugin manager"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.027746 4883 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.027780 4883 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.027922 4883 state_mem.go:36] "Initialized new in-memory state store"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.027994 4883 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.029553 4883 kubelet.go:418] "Attempting to sync node with API server"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.029577 4883 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.029600 4883 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.029612 4883 kubelet.go:324] "Adding apiserver pod source"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.029624 4883 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.031381 4883 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.032261 4883 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 10 09:03:34 crc kubenswrapper[4883]: W0310 09:03:34.033380 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.140:6443: connect: connection refused
Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.033451 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.140:6443: connect: connection refused" logger="UnhandledError"
Mar 10 09:03:34 crc kubenswrapper[4883]: W0310 09:03:34.033376 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.140:6443: connect: connection refused
Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.033521 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.140:6443: connect: connection refused" logger="UnhandledError"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.033690 4883 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 10
09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035565 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035589 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035597 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035604 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035618 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035625 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035634 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035646 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035655 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035665 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035680 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.035688 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.036085 4883 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.036570 4883 server.go:1280] "Started kubelet" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 
09:03:34.037151 4883 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 10 09:03:34 crc systemd[1]: Started Kubernetes Kubelet. Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.037159 4883 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.040034 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.140:6443: connect: connection refused Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.040175 4883 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.041719 4883 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.041771 4883 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.042124 4883 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.042141 4883 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.042552 4883 server.go:460] "Adding debug handlers to kubelet server" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.042229 4883 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.042152 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:03:34 crc kubenswrapper[4883]: W0310 09:03:34.043567 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.140:6443: connect: connection refused Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.043666 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.140:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.043672 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="200ms" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.044250 4883 factory.go:55] Registering systemd factory Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.044278 4883 factory.go:221] Registration of the systemd container factory successfully Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.047923 4883 factory.go:153] Registering CRI-O factory Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.047961 4883 factory.go:221] Registration of the crio container factory successfully Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.048024 4883 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.048051 4883 factory.go:103] Registering Raw factory Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.048097 4883 manager.go:1196] Started watching for new ooms in manager Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.047959 
4883 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.140:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b6f721305d50f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.036526351 +0000 UTC m=+0.291424240,LastTimestamp:2026-03-10 09:03:34.036526351 +0000 UTC m=+0.291424240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.049100 4883 manager.go:319] Starting recovery of all containers Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054459 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054513 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054526 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054543 4883 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054553 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054562 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054571 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054580 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054591 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054599 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054610 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054618 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054643 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054653 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054664 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054673 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054683 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054691 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054700 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054709 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054731 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054742 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054751 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054761 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054774 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054782 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054798 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054809 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054818 4883 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054828 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054837 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054846 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054856 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054865 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054873 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054882 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054890 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054899 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054908 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054916 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054925 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054934 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054945 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054954 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054963 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054971 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054980 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054988 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.054998 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055007 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055017 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055027 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055039 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055049 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055058 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055068 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055080 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055088 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055098 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" 
seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055117 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055129 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055137 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055147 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055158 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055167 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 
09:03:34.055177 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055186 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055217 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055228 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055237 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055247 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055256 4883 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055264 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055273 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055283 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055291 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055300 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055309 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055321 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055349 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055360 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055369 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055379 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055388 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055397 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055406 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055416 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055426 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055437 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055447 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055456 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055466 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055501 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055510 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055543 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055552 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" 
seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055563 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055571 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055581 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055589 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055599 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055607 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 10 09:03:34 crc 
kubenswrapper[4883]: I0310 09:03:34.055616 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055626 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055648 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055660 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055669 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055680 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055689 4883 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055698 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055710 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055720 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055733 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055743 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055754 4883 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055764 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055773 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055781 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055791 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055798 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055808 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055817 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055826 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055835 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055844 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055852 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055861 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055873 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055882 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.055892 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056801 4883 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056822 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056835 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056845 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056855 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056867 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056876 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056886 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056895 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056905 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056924 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056936 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056950 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056961 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056971 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056981 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.056991 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057001 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057011 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057020 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057028 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057038 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057052 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057061 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057071 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057080 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057089 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" 
seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057098 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057108 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057117 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057128 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057138 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057147 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 10 09:03:34 crc 
kubenswrapper[4883]: I0310 09:03:34.057156 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057165 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057174 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057182 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057193 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057203 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057212 4883 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057222 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057230 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057240 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057250 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057259 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057268 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057278 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057287 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057296 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057308 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057317 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057326 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057336 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057345 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057355 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057383 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057393 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057403 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057412 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057423 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057433 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057442 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057452 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057461 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" 
seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057482 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057493 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057503 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057512 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057523 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057531 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 
09:03:34.057548 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057558 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057567 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057576 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057585 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057595 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057603 4883 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057611 4883 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057621 4883 reconstruct.go:97] "Volume reconstruction finished" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.057627 4883 reconciler.go:26] "Reconciler: start to sync state" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.066726 4883 manager.go:324] Recovery completed Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.076012 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.076782 4883 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.078591 4883 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.078627 4883 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.078653 4883 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.078690 4883 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 10 09:03:34 crc kubenswrapper[4883]: W0310 09:03:34.079151 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.140:6443: connect: connection refused
Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.079194 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.140:6443: connect: connection refused" logger="UnhandledError"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.079286 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.079319 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.079330 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.080962 4883 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.080986 4883 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.081003 4883 state_mem.go:36] "Initialized new in-memory state store"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.085407 4883 policy_none.go:49] "None policy: Start"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.086622 4883 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.086656 4883 state_mem.go:35] "Initializing new in-memory state store"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.128221 4883 manager.go:334] "Starting Device Plugin manager"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.128255 4883 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.128275 4883 server.go:79] "Starting device plugin registration server"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.128573 4883 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.128592 4883 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.128705 4883 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.128767 4883 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.128772 4883 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.134304 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.179649 4883 kubelet.go:2421] "SyncLoop ADD" source="file"
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.179728 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.180714 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.180754 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.180783 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.180968 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.181404 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.181519 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.181775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.181797 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.181824 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.182020 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.182206 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.182266 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.182766 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.182797 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.182808 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.182768 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.182854 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.182868 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.183039 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.183149 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.183185 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.184058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.184085 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.184115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.184136 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.184156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.184207 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.184757 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.184803 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.184815 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.185023 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: 
I0310 09:03:34.185191 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.185245 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.186016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.186057 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.186071 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.186189 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.186269 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.186329 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.186416 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.186459 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.187293 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.187323 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.187336 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.228852 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.229650 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.229691 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.229704 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.229742 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.230307 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.140:6443: connect: connection refused" node="crc" Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.244628 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="400ms" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260304 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260342 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260368 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260391 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260415 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260444 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260467 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260499 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260538 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260570 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260682 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260729 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260776 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260803 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.260856 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362342 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362409 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362450 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362494 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362515 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362537 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362592 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362597 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362631 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362611 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362598 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362686 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362689 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362756 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362681 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362759 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 
09:03:34.362786 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362827 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362830 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362720 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362866 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362880 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362929 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362943 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362971 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362994 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.362982 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 
09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.363037 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.363080 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.363242 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.430419 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.431290 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.431328 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.431337 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.431360 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.431775 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.140:6443: connect: connection refused" node="crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.511143 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.519007 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: W0310 09:03:34.535860 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e65951813b67f1392cd97c4fc71e6254199735494b9a718e45e2dda9ace1845b WatchSource:0}: Error finding container e65951813b67f1392cd97c4fc71e6254199735494b9a718e45e2dda9ace1845b: Status 404 returned error can't find the container with id e65951813b67f1392cd97c4fc71e6254199735494b9a718e45e2dda9ace1845b Mar 10 09:03:34 crc kubenswrapper[4883]: W0310 09:03:34.537714 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ae3753fdb14ba12a232b57294fa46e7b9ee6159384a7a39ebf250b5e5000070f WatchSource:0}: Error finding container ae3753fdb14ba12a232b57294fa46e7b9ee6159384a7a39ebf250b5e5000070f: Status 404 returned error can't find the container with id ae3753fdb14ba12a232b57294fa46e7b9ee6159384a7a39ebf250b5e5000070f Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.543032 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: W0310 09:03:34.557340 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-02a04b5f83cc7da8186c0208f83dd53e8c5b488d0faf4c771f4fa3aa3c52cdd7 WatchSource:0}: Error finding container 02a04b5f83cc7da8186c0208f83dd53e8c5b488d0faf4c771f4fa3aa3c52cdd7: Status 404 returned error can't find the container with id 02a04b5f83cc7da8186c0208f83dd53e8c5b488d0faf4c771f4fa3aa3c52cdd7 Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.569253 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.572591 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 09:03:34 crc kubenswrapper[4883]: W0310 09:03:34.583272 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-9ca9bd41f817db21e9499e79a6182c3c620cf4f1d8ec9aaf2fc48926c5efe305 WatchSource:0}: Error finding container 9ca9bd41f817db21e9499e79a6182c3c620cf4f1d8ec9aaf2fc48926c5efe305: Status 404 returned error can't find the container with id 9ca9bd41f817db21e9499e79a6182c3c620cf4f1d8ec9aaf2fc48926c5efe305 Mar 10 09:03:34 crc kubenswrapper[4883]: W0310 09:03:34.584113 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a3a1f1caa1b2eac5d657145f28ddffe1662b780cd2261d2d6860ef8f48b8fdc3 WatchSource:0}: Error finding container a3a1f1caa1b2eac5d657145f28ddffe1662b780cd2261d2d6860ef8f48b8fdc3: Status 404 returned error can't find the container with 
id a3a1f1caa1b2eac5d657145f28ddffe1662b780cd2261d2d6860ef8f48b8fdc3 Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.645790 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="800ms" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.832344 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.833908 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.833943 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.833957 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:34 crc kubenswrapper[4883]: I0310 09:03:34.833983 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:03:34 crc kubenswrapper[4883]: E0310 09:03:34.834451 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.140:6443: connect: connection refused" node="crc" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.041155 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.140:6443: connect: connection refused Mar 10 09:03:35 crc kubenswrapper[4883]: W0310 09:03:35.052132 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.140:6443: connect: connection refused Mar 10 09:03:35 crc kubenswrapper[4883]: E0310 09:03:35.052250 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.140:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.084052 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766" exitCode=0 Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.084128 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766"} Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.084303 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ae3753fdb14ba12a232b57294fa46e7b9ee6159384a7a39ebf250b5e5000070f"} Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.084460 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.085705 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.085765 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 
09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.085779 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.086258 4883 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d" exitCode=0 Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.086355 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d"} Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.086419 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e65951813b67f1392cd97c4fc71e6254199735494b9a718e45e2dda9ace1845b"} Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.086589 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.087654 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.088147 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.088184 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.088195 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.088260 4883 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.088282 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.088291 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.088364 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d"} Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.088402 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a3a1f1caa1b2eac5d657145f28ddffe1662b780cd2261d2d6860ef8f48b8fdc3"} Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.091674 4883 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88" exitCode=0 Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.091749 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88"} Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.091780 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9ca9bd41f817db21e9499e79a6182c3c620cf4f1d8ec9aaf2fc48926c5efe305"} Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.091876 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.092796 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.092825 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.092838 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.093401 4883 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681" exitCode=0 Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.093429 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681"} Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.093447 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"02a04b5f83cc7da8186c0208f83dd53e8c5b488d0faf4c771f4fa3aa3c52cdd7"} Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.093620 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.094355 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.094390 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.094400 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:35 crc kubenswrapper[4883]: W0310 09:03:35.275782 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.140:6443: connect: connection refused Mar 10 09:03:35 crc kubenswrapper[4883]: E0310 09:03:35.275869 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.140:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:03:35 crc kubenswrapper[4883]: E0310 09:03:35.447769 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="1.6s" Mar 10 09:03:35 crc kubenswrapper[4883]: W0310 09:03:35.509788 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.140:6443: connect: connection refused Mar 10 09:03:35 crc kubenswrapper[4883]: E0310 09:03:35.509872 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.140:6443: connect: connection refused" logger="UnhandledError" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.635914 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.637535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.637607 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.637636 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.637687 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:03:35 crc kubenswrapper[4883]: E0310 09:03:35.638163 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.140:6443: connect: connection refused" node="crc" Mar 10 09:03:35 crc kubenswrapper[4883]: W0310 09:03:35.639619 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.140:6443: connect: connection refused Mar 10 09:03:35 crc kubenswrapper[4883]: E0310 09:03:35.639701 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.140:6443: connect: connection refused" logger="UnhandledError" Mar 10 
09:03:35 crc kubenswrapper[4883]: I0310 09:03:35.990587 4883 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.098009 4883 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902" exitCode=0 Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.098095 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902"} Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.098237 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.099105 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.099146 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.099157 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.103646 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cc7a5c53fffb10cf35fef3434af4529b1a9b8ca7a0b204a1cd086c7f056ea348"} Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.103693 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0"} Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.103704 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef"} Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.103715 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada"} Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.103724 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3"} Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.103824 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.104652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.104679 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.104687 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.106456 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825"}
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.106551 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.107423 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.107493 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.107505 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.110182 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9"}
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.110218 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332"}
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.110231 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1"}
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.110234 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.111173 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.111211 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.111224 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.113139 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74"}
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.113182 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214"}
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.113202 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2"}
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.113282 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.114064 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.114095 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:36 crc kubenswrapper[4883]: I0310 09:03:36.114110 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.118045 4883 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e" exitCode=0
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.118109 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e"}
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.118169 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.118228 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.123187 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.123211 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.123226 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.123237 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.123251 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.123240 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.238732 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.239840 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.239874 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.239887 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.239910 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.336200 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.696526 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.696651 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.697618 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.697654 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:37 crc kubenswrapper[4883]: I0310 09:03:37.697665 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.126533 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3"}
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.126574 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.126590 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28"}
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.126601 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0"}
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.126610 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24"}
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.126619 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf"}
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.126716 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.127392 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.127423 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.127436 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.127611 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.127647 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.127656 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.351540 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.513584 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.513749 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.514855 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.514887 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.514901 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:38 crc kubenswrapper[4883]: I0310 09:03:38.603891 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.129202 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.130003 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.130030 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.130040 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.337279 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.337382 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.338498 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.338528 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.338541 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.851102 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.851235 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.852117 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.852155 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:39 crc kubenswrapper[4883]: I0310 09:03:39.852166 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:40 crc kubenswrapper[4883]: I0310 09:03:40.078858 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:03:40 crc kubenswrapper[4883]: I0310 09:03:40.130722 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:40 crc kubenswrapper[4883]: I0310 09:03:40.130771 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:40 crc kubenswrapper[4883]: I0310 09:03:40.131533 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:40 crc kubenswrapper[4883]: I0310 09:03:40.131556 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:40 crc kubenswrapper[4883]: I0310 09:03:40.131570 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:40 crc kubenswrapper[4883]: I0310 09:03:40.131581 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:40 crc kubenswrapper[4883]: I0310 09:03:40.131574 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:40 crc kubenswrapper[4883]: I0310 09:03:40.131665 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:41 crc kubenswrapper[4883]: I0310 09:03:41.890832 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 09:03:41 crc kubenswrapper[4883]: I0310 09:03:41.890947 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:41 crc kubenswrapper[4883]: I0310 09:03:41.891957 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:41 crc kubenswrapper[4883]: I0310 09:03:41.892007 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:41 crc kubenswrapper[4883]: I0310 09:03:41.892017 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:44 crc kubenswrapper[4883]: E0310 09:03:44.134423 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 09:03:44 crc kubenswrapper[4883]: I0310 09:03:44.891382 4883 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 09:03:44 crc kubenswrapper[4883]: I0310 09:03:44.891487 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 09:03:45 crc kubenswrapper[4883]: I0310 09:03:45.188858 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 09:03:45 crc kubenswrapper[4883]: I0310 09:03:45.189634 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:45 crc kubenswrapper[4883]: I0310 09:03:45.191163 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:45 crc kubenswrapper[4883]: I0310 09:03:45.191207 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:45 crc kubenswrapper[4883]: I0310 09:03:45.191216 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:45 crc kubenswrapper[4883]: I0310 09:03:45.194270 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 09:03:45 crc kubenswrapper[4883]: E0310 09:03:45.992961 4883 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 10 09:03:46 crc kubenswrapper[4883]: I0310 09:03:46.041882 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 10 09:03:46 crc kubenswrapper[4883]: E0310 09:03:46.125235 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:46Z is after 2026-02-23T05:33:13Z" interval="3.2s"
Mar 10 09:03:46 crc kubenswrapper[4883]: E0310 09:03:46.130857 4883 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:46Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b6f721305d50f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.036526351 +0000 UTC m=+0.291424240,LastTimestamp:2026-03-10 09:03:34.036526351 +0000 UTC m=+0.291424240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 09:03:46 crc kubenswrapper[4883]: I0310 09:03:46.133351 4883 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 10 09:03:46 crc kubenswrapper[4883]: I0310 09:03:46.133396 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 10 09:03:46 crc kubenswrapper[4883]: E0310 09:03:46.134547 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:46Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 10 09:03:46 crc kubenswrapper[4883]: W0310 09:03:46.137604 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:46Z is after 2026-02-23T05:33:13Z
Mar 10 09:03:46 crc kubenswrapper[4883]: E0310 09:03:46.137665 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:03:46 crc kubenswrapper[4883]: I0310 09:03:46.138336 4883 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 10 09:03:46 crc kubenswrapper[4883]: I0310 09:03:46.138369 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 10 09:03:46 crc kubenswrapper[4883]: W0310 09:03:46.140962 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:46Z is after 2026-02-23T05:33:13Z
Mar 10 09:03:46 crc kubenswrapper[4883]: E0310 09:03:46.141112 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:03:46 crc kubenswrapper[4883]: W0310 09:03:46.142101 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:46Z is after 2026-02-23T05:33:13Z
Mar 10 09:03:46 crc kubenswrapper[4883]: E0310 09:03:46.142153 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:03:46 crc kubenswrapper[4883]: W0310 09:03:46.143218 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:46Z is after 2026-02-23T05:33:13Z
Mar 10 09:03:46 crc kubenswrapper[4883]: E0310 09:03:46.143275 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 09:03:46 crc kubenswrapper[4883]: I0310 09:03:46.144335 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:46 crc kubenswrapper[4883]: I0310 09:03:46.145106 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:46 crc kubenswrapper[4883]: I0310 09:03:46.145132 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:46 crc kubenswrapper[4883]: I0310 09:03:46.145141 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:46 crc kubenswrapper[4883]: I0310 09:03:46.148784 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.043349 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:47Z is after 2026-02-23T05:33:13Z
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.148055 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.149742 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cc7a5c53fffb10cf35fef3434af4529b1a9b8ca7a0b204a1cd086c7f056ea348" exitCode=255
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.149817 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cc7a5c53fffb10cf35fef3434af4529b1a9b8ca7a0b204a1cd086c7f056ea348"}
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.149889 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.149981 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.150738 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.150761 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.150771 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.150855 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.150880 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.150891 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.151241 4883 scope.go:117] "RemoveContainer" containerID="cc7a5c53fffb10cf35fef3434af4529b1a9b8ca7a0b204a1cd086c7f056ea348"
Mar 10 09:03:47 crc kubenswrapper[4883]: I0310 09:03:47.373024 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.043377 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:48Z is after 2026-02-23T05:33:13Z
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.154379 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.154779 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.156207 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f" exitCode=255
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.156264 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f"}
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.156334 4883 scope.go:117] "RemoveContainer" containerID="cc7a5c53fffb10cf35fef3434af4529b1a9b8ca7a0b204a1cd086c7f056ea348"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.156339 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.157350 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.157387 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.157398 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.157871 4883 scope.go:117] "RemoveContainer" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f"
Mar 10 09:03:48 crc kubenswrapper[4883]: E0310 09:03:48.159630 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.370422 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.370650 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.371710 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.371771 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.371791 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.380496 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 10 09:03:48 crc kubenswrapper[4883]: I0310 09:03:48.513842 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.042710 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:49Z is after 2026-02-23T05:33:13Z
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.159667 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.161587 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.161612 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.162392 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.162434 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.162433 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.162449 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.162492 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.162558 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.163031 4883 scope.go:117] "RemoveContainer" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f"
Mar 10 09:03:49 crc kubenswrapper[4883]: E0310 09:03:49.163208 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 09:03:49 crc kubenswrapper[4883]: E0310 09:03:49.327372 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:49Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.335577 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.336354 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.336385 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.336398 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:49 crc kubenswrapper[4883]: I0310 09:03:49.336417 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 09:03:49 crc kubenswrapper[4883]: E0310 09:03:49.338627 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:49Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 10 09:03:50 crc kubenswrapper[4883]: I0310 09:03:50.042427 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:50Z is after 2026-02-23T05:33:13Z
Mar 10 09:03:50 crc kubenswrapper[4883]: I0310 09:03:50.083105 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:03:50 crc kubenswrapper[4883]: I0310 09:03:50.154104 4883 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 09:03:50 crc kubenswrapper[4883]: E0310 09:03:50.156603 4883 certificate_manager.go:562] "Unhandled Error"
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 09:03:50 crc kubenswrapper[4883]: I0310 09:03:50.163621 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:50 crc kubenswrapper[4883]: I0310 09:03:50.164560 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:50 crc kubenswrapper[4883]: I0310 09:03:50.164633 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:50 crc kubenswrapper[4883]: I0310 09:03:50.164642 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:50 crc kubenswrapper[4883]: I0310 09:03:50.165107 4883 scope.go:117] "RemoveContainer" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f" Mar 10 09:03:50 crc kubenswrapper[4883]: E0310 09:03:50.165270 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:03:50 crc kubenswrapper[4883]: I0310 09:03:50.166639 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:03:50 crc kubenswrapper[4883]: W0310 
09:03:50.524311 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:50Z is after 2026-02-23T05:33:13Z Mar 10 09:03:50 crc kubenswrapper[4883]: E0310 09:03:50.525083 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 09:03:51 crc kubenswrapper[4883]: I0310 09:03:51.042746 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:51Z is after 2026-02-23T05:33:13Z Mar 10 09:03:51 crc kubenswrapper[4883]: I0310 09:03:51.166034 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:51 crc kubenswrapper[4883]: I0310 09:03:51.166986 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:51 crc kubenswrapper[4883]: I0310 09:03:51.167042 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:51 crc kubenswrapper[4883]: I0310 09:03:51.167052 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:51 crc kubenswrapper[4883]: I0310 09:03:51.167549 
4883 scope.go:117] "RemoveContainer" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f" Mar 10 09:03:51 crc kubenswrapper[4883]: E0310 09:03:51.168044 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:03:51 crc kubenswrapper[4883]: W0310 09:03:51.563285 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:51Z is after 2026-02-23T05:33:13Z Mar 10 09:03:51 crc kubenswrapper[4883]: E0310 09:03:51.563369 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:03:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 09:03:52 crc kubenswrapper[4883]: I0310 09:03:52.044140 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:03:52 crc kubenswrapper[4883]: I0310 09:03:52.167565 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 
09:03:52 crc kubenswrapper[4883]: I0310 09:03:52.168277 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:52 crc kubenswrapper[4883]: I0310 09:03:52.168304 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:52 crc kubenswrapper[4883]: I0310 09:03:52.168313 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:52 crc kubenswrapper[4883]: I0310 09:03:52.168744 4883 scope.go:117] "RemoveContainer" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f" Mar 10 09:03:52 crc kubenswrapper[4883]: E0310 09:03:52.168895 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:03:52 crc kubenswrapper[4883]: W0310 09:03:52.199336 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 10 09:03:52 crc kubenswrapper[4883]: E0310 09:03:52.199383 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 09:03:52 crc kubenswrapper[4883]: W0310 09:03:52.467321 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User 
"system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 10 09:03:52 crc kubenswrapper[4883]: E0310 09:03:52.467370 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 09:03:53 crc kubenswrapper[4883]: I0310 09:03:53.044686 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:03:54 crc kubenswrapper[4883]: I0310 09:03:54.043814 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:03:54 crc kubenswrapper[4883]: E0310 09:03:54.134559 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 09:03:54 crc kubenswrapper[4883]: I0310 09:03:54.891844 4883 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 09:03:54 crc kubenswrapper[4883]: I0310 09:03:54.892248 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" 
probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 09:03:55 crc kubenswrapper[4883]: I0310 09:03:55.044178 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:03:55 crc kubenswrapper[4883]: E0310 09:03:55.731779 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 09:03:55 crc kubenswrapper[4883]: I0310 09:03:55.738760 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 09:03:55 crc kubenswrapper[4883]: I0310 09:03:55.739996 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:03:55 crc kubenswrapper[4883]: I0310 09:03:55.740036 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:03:55 crc kubenswrapper[4883]: I0310 09:03:55.740046 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:03:55 crc kubenswrapper[4883]: I0310 09:03:55.740073 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 09:03:55 crc kubenswrapper[4883]: E0310 09:03:55.743721 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 09:03:56 crc kubenswrapper[4883]: I0310 09:03:56.043755 4883 csi_plugin.go:884] Failed 
to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.135455 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721305d50f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.036526351 +0000 UTC m=+0.291424240,LastTimestamp:2026-03-10 09:03:34.036526351 +0000 UTC m=+0.291424240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.139587 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592aec8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.07931156 +0000 UTC m=+0.334209449,LastTimestamp:2026-03-10 09:03:34.07931156 +0000 UTC m=+0.334209449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: 
E0310 09:03:56.142819 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592e557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079325527 +0000 UTC m=+0.334223416,LastTimestamp:2026-03-10 09:03:34.079325527 +0000 UTC m=+0.334223416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.145585 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f7215930ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079336167 +0000 UTC m=+0.334234056,LastTimestamp:2026-03-10 09:03:34.079336167 +0000 UTC m=+0.334234056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.148663 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189b6f7218a1acf7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.130625783 +0000 UTC m=+0.385523672,LastTimestamp:2026-03-10 09:03:34.130625783 +0000 UTC m=+0.385523672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.151899 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592aec8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592aec8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.07931156 +0000 UTC m=+0.334209449,LastTimestamp:2026-03-10 09:03:34.180736585 +0000 UTC m=+0.435634475,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.154809 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592e557\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592e557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079325527 +0000 UTC m=+0.334223416,LastTimestamp:2026-03-10 09:03:34.180777654 +0000 UTC m=+0.435675542,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.157778 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f7215930ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f7215930ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079336167 +0000 UTC m=+0.334234056,LastTimestamp:2026-03-10 09:03:34.180789477 +0000 UTC m=+0.435687365,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.160836 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592aec8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592aec8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.07931156 +0000 UTC m=+0.334209449,LastTimestamp:2026-03-10 09:03:34.181792362 +0000 UTC m=+0.436690250,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.164039 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592e557\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592e557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079325527 +0000 UTC m=+0.334223416,LastTimestamp:2026-03-10 09:03:34.181820917 +0000 UTC m=+0.436718806,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.166971 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f7215930ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f7215930ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079336167 +0000 UTC 
m=+0.334234056,LastTimestamp:2026-03-10 09:03:34.181830625 +0000 UTC m=+0.436728514,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.169935 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592aec8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592aec8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.07931156 +0000 UTC m=+0.334209449,LastTimestamp:2026-03-10 09:03:34.182790559 +0000 UTC m=+0.437688447,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.172902 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592e557\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592e557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079325527 +0000 UTC m=+0.334223416,LastTimestamp:2026-03-10 09:03:34.182803693 +0000 UTC m=+0.437701582,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.176332 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f7215930ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f7215930ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079336167 +0000 UTC m=+0.334234056,LastTimestamp:2026-03-10 09:03:34.182814364 +0000 UTC m=+0.437712253,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.179510 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592aec8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592aec8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.07931156 +0000 UTC m=+0.334209449,LastTimestamp:2026-03-10 09:03:34.182842207 +0000 UTC m=+0.437740096,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.182432 4883 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592e557\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592e557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079325527 +0000 UTC m=+0.334223416,LastTimestamp:2026-03-10 09:03:34.182861945 +0000 UTC m=+0.437759835,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.185368 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f7215930ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f7215930ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079336167 +0000 UTC m=+0.334234056,LastTimestamp:2026-03-10 09:03:34.182874288 +0000 UTC m=+0.437772178,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.188295 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592aec8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592aec8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.07931156 +0000 UTC m=+0.334209449,LastTimestamp:2026-03-10 09:03:34.184075926 +0000 UTC m=+0.438973815,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.191495 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592e557\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592e557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079325527 +0000 UTC m=+0.334223416,LastTimestamp:2026-03-10 09:03:34.18409427 +0000 UTC m=+0.438992160,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.194313 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f7215930ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f7215930ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079336167 +0000 UTC m=+0.334234056,LastTimestamp:2026-03-10 09:03:34.184123096 +0000 UTC m=+0.439020986,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.197286 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592aec8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592aec8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.07931156 +0000 UTC m=+0.334209449,LastTimestamp:2026-03-10 09:03:34.184149898 +0000 UTC m=+0.439047787,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.200262 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592e557\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592e557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079325527 +0000 UTC m=+0.334223416,LastTimestamp:2026-03-10 09:03:34.184201587 +0000 UTC m=+0.439099475,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.203301 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f7215930ee7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f7215930ee7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079336167 +0000 UTC m=+0.334234056,LastTimestamp:2026-03-10 09:03:34.184214181 +0000 UTC m=+0.439112071,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.206773 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592aec8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592aec8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.07931156 +0000 UTC 
m=+0.334209449,LastTimestamp:2026-03-10 09:03:34.184775549 +0000 UTC m=+0.439673438,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.209978 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b6f721592e557\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b6f721592e557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.079325527 +0000 UTC m=+0.334223416,LastTimestamp:2026-03-10 09:03:34.184811337 +0000 UTC m=+0.439709216,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.213541 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b6f723105872f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.539822895 +0000 UTC m=+0.794720784,LastTimestamp:2026-03-10 09:03:34.539822895 +0000 UTC m=+0.794720784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.216953 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f72310abf6c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.540164972 +0000 UTC m=+0.795062861,LastTimestamp:2026-03-10 09:03:34.540164972 +0000 UTC m=+0.795062861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.219823 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72323e13e4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.560306148 +0000 UTC m=+0.815204037,LastTimestamp:2026-03-10 09:03:34.560306148 +0000 UTC m=+0.815204037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.222763 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f7233c1904d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.585700429 +0000 UTC m=+0.840598319,LastTimestamp:2026-03-10 09:03:34.585700429 +0000 UTC m=+0.840598319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.225761 4883 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f7233cbc43b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.586369083 +0000 UTC m=+0.841266972,LastTimestamp:2026-03-10 09:03:34.586369083 +0000 UTC m=+0.841266972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.229198 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f7248e4f257 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.940340823 +0000 UTC m=+1.195238712,LastTimestamp:2026-03-10 09:03:34.940340823 +0000 UTC 
m=+1.195238712,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.232451 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f7248f773df openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.941553631 +0000 UTC m=+1.196451520,LastTimestamp:2026-03-10 09:03:34.941553631 +0000 UTC m=+1.196451520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.236045 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f724903d80e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.94236571 
+0000 UTC m=+1.197263599,LastTimestamp:2026-03-10 09:03:34.94236571 +0000 UTC m=+1.197263599,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.239143 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f724925436c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.944555884 +0000 UTC m=+1.199453774,LastTimestamp:2026-03-10 09:03:34.944555884 +0000 UTC m=+1.199453774,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.243040 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b6f7249613ef6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.948486902 +0000 UTC 
m=+1.203384791,LastTimestamp:2026-03-10 09:03:34.948486902 +0000 UTC m=+1.203384791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.246132 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f7249b52974 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.95398642 +0000 UTC m=+1.208884309,LastTimestamp:2026-03-10 09:03:34.95398642 +0000 UTC m=+1.208884309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.249190 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f7249c640a5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container 
wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.955106469 +0000 UTC m=+1.210004358,LastTimestamp:2026-03-10 09:03:34.955106469 +0000 UTC m=+1.210004358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.252323 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f7249dcba31 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.956579377 +0000 UTC m=+1.211477267,LastTimestamp:2026-03-10 09:03:34.956579377 +0000 UTC m=+1.211477267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.255572 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f7249fbd4fa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.95861785 +0000 UTC m=+1.213515740,LastTimestamp:2026-03-10 09:03:34.95861785 +0000 UTC m=+1.213515740,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.259001 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b6f724a19d22d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.960583213 +0000 UTC m=+1.215481103,LastTimestamp:2026-03-10 09:03:34.960583213 +0000 UTC m=+1.215481103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.262143 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f724a21c614 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:34.961104404 +0000 UTC m=+1.216002293,LastTimestamp:2026-03-10 09:03:34.961104404 +0000 UTC m=+1.216002293,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.265244 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f7251a93eb8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.08742316 +0000 UTC m=+1.342321049,LastTimestamp:2026-03-10 09:03:35.08742316 +0000 UTC m=+1.342321049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.268558 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b6f7251caa315 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.089611541 +0000 UTC m=+1.344509430,LastTimestamp:2026-03-10 09:03:35.089611541 +0000 UTC m=+1.344509430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.271740 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f7252212151 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.095279953 +0000 UTC m=+1.350177842,LastTimestamp:2026-03-10 09:03:35.095279953 +0000 UTC 
m=+1.350177842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.274939 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f7252237ff3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.095435251 +0000 UTC m=+1.350333140,LastTimestamp:2026-03-10 09:03:35.095435251 +0000 UTC m=+1.350333140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.278261 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f7258dcbdad openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container 
cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.208238509 +0000 UTC m=+1.463136398,LastTimestamp:2026-03-10 09:03:35.208238509 +0000 UTC m=+1.463136398,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.281320 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f72594df1ff openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.215657471 +0000 UTC m=+1.470555361,LastTimestamp:2026-03-10 09:03:35.215657471 +0000 UTC m=+1.470555361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.284440 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f7259618240 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.216939584 +0000 UTC m=+1.471837472,LastTimestamp:2026-03-10 09:03:35.216939584 +0000 UTC m=+1.471837472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.287362 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f725a07ddca openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.227841994 +0000 UTC m=+1.482739883,LastTimestamp:2026-03-10 09:03:35.227841994 +0000 UTC m=+1.482739883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.290399 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b6f725a15cda7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.228755367 +0000 UTC m=+1.483653256,LastTimestamp:2026-03-10 09:03:35.228755367 +0000 UTC m=+1.483653256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.293403 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b6f725a966ef6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.23718527 +0000 UTC m=+1.492083160,LastTimestamp:2026-03-10 09:03:35.23718527 +0000 UTC m=+1.492083160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc 
kubenswrapper[4883]: E0310 09:03:56.295431 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f725a966eb0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.2371852 +0000 UTC m=+1.492083089,LastTimestamp:2026-03-10 09:03:35.2371852 +0000 UTC m=+1.492083089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.296387 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f725a99fd26 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.237418278 +0000 UTC m=+1.492316168,LastTimestamp:2026-03-10 09:03:35.237418278 +0000 UTC m=+1.492316168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.298894 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f725aa20d26 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.237946662 +0000 UTC m=+1.492844551,LastTimestamp:2026-03-10 09:03:35.237946662 +0000 UTC m=+1.492844551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.301853 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f725b1cb1fd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 
09:03:35.245984253 +0000 UTC m=+1.500882142,LastTimestamp:2026-03-10 09:03:35.245984253 +0000 UTC m=+1.500882142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.304681 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f725b9cfeb9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.254392505 +0000 UTC m=+1.509290395,LastTimestamp:2026-03-10 09:03:35.254392505 +0000 UTC m=+1.509290395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.307606 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f725ba73b6a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.255063402 +0000 UTC m=+1.509961292,LastTimestamp:2026-03-10 09:03:35.255063402 +0000 UTC m=+1.509961292,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.310515 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f725bce99fc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.257643516 +0000 UTC m=+1.512541405,LastTimestamp:2026-03-10 09:03:35.257643516 +0000 UTC m=+1.512541405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.313423 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f72630c3fcd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.379124173 +0000 UTC m=+1.634022063,LastTimestamp:2026-03-10 09:03:35.379124173 +0000 UTC m=+1.634022063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.316304 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f7263131541 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.379572033 +0000 UTC m=+1.634469922,LastTimestamp:2026-03-10 09:03:35.379572033 +0000 UTC m=+1.634469922,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.319160 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f7263528be9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.383731177 +0000 UTC m=+1.638629067,LastTimestamp:2026-03-10 09:03:35.383731177 +0000 UTC m=+1.638629067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.322773 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f72637b30f7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.386394871 +0000 UTC m=+1.641292761,LastTimestamp:2026-03-10 09:03:35.386394871 +0000 UTC m=+1.641292761,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.325806 4883 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f72638b44ea openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.387448554 +0000 UTC m=+1.642346443,LastTimestamp:2026-03-10 09:03:35.387448554 +0000 UTC m=+1.642346443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.328777 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f72638edcb0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.387684016 +0000 UTC m=+1.642581905,LastTimestamp:2026-03-10 09:03:35.387684016 +0000 UTC 
m=+1.642581905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.331618 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f72639b5e76 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.38850367 +0000 UTC m=+1.643401559,LastTimestamp:2026-03-10 09:03:35.38850367 +0000 UTC m=+1.643401559,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.338325 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f7263ba0ec1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.390514881 +0000 UTC m=+1.645412770,LastTimestamp:2026-03-10 09:03:35.390514881 +0000 UTC m=+1.645412770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.341752 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f7263e008b2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.393003698 +0000 UTC m=+1.647901588,LastTimestamp:2026-03-10 09:03:35.393003698 +0000 UTC m=+1.647901588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.344769 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f726c5ee96b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.535536491 +0000 UTC m=+1.790434380,LastTimestamp:2026-03-10 09:03:35.535536491 +0000 UTC m=+1.790434380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.351848 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f726c6097e5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.535646693 +0000 UTC m=+1.790544582,LastTimestamp:2026-03-10 09:03:35.535646693 +0000 UTC m=+1.790544582,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.355034 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b6f726cce83fa openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.542850554 +0000 UTC m=+1.797748443,LastTimestamp:2026-03-10 09:03:35.542850554 +0000 UTC m=+1.797748443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.358157 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f726cd657d2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container 
kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.543363538 +0000 UTC m=+1.798261427,LastTimestamp:2026-03-10 09:03:35.543363538 +0000 UTC m=+1.798261427,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.361166 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f726e3e66bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.566960319 +0000 UTC m=+1.821858208,LastTimestamp:2026-03-10 09:03:35.566960319 +0000 UTC m=+1.821858208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.363864 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f726eac3184 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.574155652 +0000 UTC m=+1.829053541,LastTimestamp:2026-03-10 09:03:35.574155652 +0000 UTC m=+1.829053541,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.369584 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f726eba1b51 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.575067473 +0000 UTC m=+1.829965362,LastTimestamp:2026-03-10 09:03:35.575067473 +0000 UTC m=+1.829965362,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.373706 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f7275ad6eeb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.691677419 +0000 UTC m=+1.946575308,LastTimestamp:2026-03-10 09:03:35.691677419 +0000 UTC m=+1.946575308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.376617 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f72763bc1f1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.701004785 +0000 UTC m=+1.955902674,LastTimestamp:2026-03-10 09:03:35.701004785 +0000 UTC m=+1.955902674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.379772 
4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f72764cefaf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.702130607 +0000 UTC m=+1.957028495,LastTimestamp:2026-03-10 09:03:35.702130607 +0000 UTC m=+1.957028495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.382895 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f72803847c9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.868549065 +0000 UTC m=+2.123446944,LastTimestamp:2026-03-10 09:03:35.868549065 +0000 UTC 
m=+2.123446944,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.385717 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f7280919383 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.874401155 +0000 UTC m=+2.129299045,LastTimestamp:2026-03-10 09:03:35.874401155 +0000 UTC m=+2.129299045,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.388840 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f728e0d0e8d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:36.100597389 +0000 UTC m=+2.355495278,LastTimestamp:2026-03-10 09:03:36.100597389 +0000 UTC m=+2.355495278,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.392150 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f7296eaa899 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:36.249338009 +0000 UTC m=+2.504235898,LastTimestamp:2026-03-10 09:03:36.249338009 +0000 UTC m=+2.504235898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.395038 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f7297617cdb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:36.257125595 +0000 UTC m=+2.512023494,LastTimestamp:2026-03-10 09:03:36.257125595 +0000 UTC m=+2.512023494,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.398131 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72cb1b0f95 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.124925333 +0000 UTC m=+3.379823223,LastTimestamp:2026-03-10 09:03:37.124925333 +0000 UTC m=+3.379823223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.401186 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72d24c2226 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.245581862 +0000 UTC m=+3.500479751,LastTimestamp:2026-03-10 09:03:37.245581862 +0000 UTC m=+3.500479751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.404410 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72d2b9d8e7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.252772071 +0000 UTC m=+3.507669960,LastTimestamp:2026-03-10 09:03:37.252772071 +0000 UTC m=+3.507669960,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.407240 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72d2c87c3a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.253731386 +0000 UTC m=+3.508629276,LastTimestamp:2026-03-10 09:03:37.253731386 +0000 UTC m=+3.508629276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.409941 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72da972cd3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.384717523 +0000 UTC m=+3.639615412,LastTimestamp:2026-03-10 09:03:37.384717523 +0000 UTC m=+3.639615412,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.412757 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72db0a692a openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.39226961 +0000 UTC m=+3.647167500,LastTimestamp:2026-03-10 09:03:37.39226961 +0000 UTC m=+3.647167500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.415818 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72db17f6a4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.393157796 +0000 UTC m=+3.648055685,LastTimestamp:2026-03-10 09:03:37.393157796 +0000 UTC m=+3.648055685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.418673 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b6f72e22a3490 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.511793808 +0000 UTC m=+3.766691698,LastTimestamp:2026-03-10 09:03:37.511793808 +0000 UTC m=+3.766691698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.421605 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72e2969730 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.518896944 +0000 UTC m=+3.773794834,LastTimestamp:2026-03-10 09:03:37.518896944 +0000 UTC m=+3.773794834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.424622 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72e2a33b26 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.51972535 +0000 UTC m=+3.774623239,LastTimestamp:2026-03-10 09:03:37.51972535 +0000 UTC m=+3.774623239,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.427386 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72eae42d50 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.658199376 +0000 UTC m=+3.913097266,LastTimestamp:2026-03-10 09:03:37.658199376 +0000 UTC m=+3.913097266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.430372 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b6f72eb5681b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.665692087 +0000 UTC m=+3.920589976,LastTimestamp:2026-03-10 09:03:37.665692087 +0000 UTC m=+3.920589976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.433011 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72eb6214e9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.666450665 +0000 UTC m=+3.921348564,LastTimestamp:2026-03-10 09:03:37.666450665 +0000 UTC m=+3.921348564,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.435825 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72f35c98a2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.800308898 +0000 UTC m=+4.055206797,LastTimestamp:2026-03-10 09:03:37.800308898 +0000 UTC m=+4.055206797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.438889 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b6f72f3d97902 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:37.808492802 +0000 UTC m=+4.063390691,LastTimestamp:2026-03-10 09:03:37.808492802 +0000 UTC m=+4.063390691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.442857 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 09:03:56 crc 
kubenswrapper[4883]: &Event{ObjectMeta:{kube-controller-manager-crc.189b6f749a06c26e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 10 09:03:56 crc kubenswrapper[4883]: body: Mar 10 09:03:56 crc kubenswrapper[4883]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:44.89144587 +0000 UTC m=+11.146343769,LastTimestamp:2026-03-10 09:03:44.89144587 +0000 UTC m=+11.146343769,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:03:56 crc kubenswrapper[4883]: > Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.445672 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f749a0809ba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:44.891529658 +0000 UTC 
m=+11.146427557,LastTimestamp:2026-03-10 09:03:44.891529658 +0000 UTC m=+11.146427557,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.448677 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 09:03:56 crc kubenswrapper[4883]: &Event{ObjectMeta:{kube-apiserver-crc.189b6f74e40d3446 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 09:03:56 crc kubenswrapper[4883]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 09:03:56 crc kubenswrapper[4883]: Mar 10 09:03:56 crc kubenswrapper[4883]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:46.133382214 +0000 UTC m=+12.388280103,LastTimestamp:2026-03-10 09:03:46.133382214 +0000 UTC m=+12.388280103,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:03:56 crc kubenswrapper[4883]: > Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.451810 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f74e40db403 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:46.133414915 +0000 UTC m=+12.388312804,LastTimestamp:2026-03-10 09:03:46.133414915 +0000 UTC m=+12.388312804,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.454683 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b6f74e40d3446\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 09:03:56 crc kubenswrapper[4883]: &Event{ObjectMeta:{kube-apiserver-crc.189b6f74e40d3446 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 09:03:56 crc kubenswrapper[4883]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 09:03:56 crc kubenswrapper[4883]: Mar 10 09:03:56 crc kubenswrapper[4883]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:46.133382214 +0000 UTC 
m=+12.388280103,LastTimestamp:2026-03-10 09:03:46.13835916 +0000 UTC m=+12.393257049,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:03:56 crc kubenswrapper[4883]: > Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.457314 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b6f74e40db403\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f74e40db403 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:46.133414915 +0000 UTC m=+12.388312804,LastTimestamp:2026-03-10 09:03:46.138385008 +0000 UTC m=+12.393282897,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.460557 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b6f72764cefaf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f72764cefaf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.702130607 +0000 UTC m=+1.957028495,LastTimestamp:2026-03-10 09:03:47.152219812 +0000 UTC m=+13.407117701,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.463582 4883 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b6f72803847c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f72803847c9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.868549065 +0000 UTC m=+2.123446944,LastTimestamp:2026-03-10 09:03:47.265834405 +0000 UTC m=+13.520732295,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.466575 4883 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189b6f7280919383\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b6f7280919383 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:35.874401155 +0000 UTC m=+2.129299045,LastTimestamp:2026-03-10 09:03:47.271662585 +0000 UTC m=+13.526560474,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.470675 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 09:03:56 crc kubenswrapper[4883]: &Event{ObjectMeta:{kube-controller-manager-crc.189b6f76ee1e44d1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 09:03:56 crc kubenswrapper[4883]: body: Mar 10 09:03:56 crc kubenswrapper[4883]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:54.892207313 +0000 UTC m=+21.147105203,LastTimestamp:2026-03-10 09:03:54.892207313 +0000 UTC m=+21.147105203,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 09:03:56 crc kubenswrapper[4883]: > Mar 10 09:03:56 crc kubenswrapper[4883]: E0310 09:03:56.473510 4883 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b6f76ee20538d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:03:54.892342157 +0000 UTC m=+21.147240046,LastTimestamp:2026-03-10 09:03:54.892342157 +0000 UTC m=+21.147240046,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.044513 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.372673 4883 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.372864 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.373838 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.373885 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.373898 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:03:57 crc kubenswrapper[4883]: I0310 09:03:57.374353 4883 scope.go:117] "RemoveContainer" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f"
Mar 10 09:03:57 crc kubenswrapper[4883]: E0310 09:03:57.374531 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 09:03:58 crc kubenswrapper[4883]: I0310 09:03:58.043595 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:03:58 crc kubenswrapper[4883]: W0310 09:03:58.075390 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:03:58 crc kubenswrapper[4883]: E0310 09:03:58.075461 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 10 09:03:58 crc kubenswrapper[4883]: I0310 09:03:58.369757 4883 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 09:03:58 crc kubenswrapper[4883]: I0310 09:03:58.381400 4883 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 10 09:03:59 crc kubenswrapper[4883]: I0310 09:03:59.043965 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:03:59 crc kubenswrapper[4883]: W0310 09:03:59.314264 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 10 09:03:59 crc kubenswrapper[4883]: E0310 09:03:59.314321 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 10 09:04:00 crc kubenswrapper[4883]: I0310 09:04:00.043298 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:01 crc kubenswrapper[4883]: I0310 09:04:01.043427 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:01 crc kubenswrapper[4883]: W0310 09:04:01.273431 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 10 09:04:01 crc kubenswrapper[4883]: E0310 09:04:01.273514 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.043397 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.177579 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.177737 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.178669 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.178710 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.178719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.181070 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.189746 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.190546 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.190581 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.190594 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:02 crc kubenswrapper[4883]: W0310 09:04:02.705874 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 10 09:04:02 crc kubenswrapper[4883]: E0310 09:04:02.705926 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 10 09:04:02 crc kubenswrapper[4883]: E0310 09:04:02.736238 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.744101 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.745230 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.745291 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.745306 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:02 crc kubenswrapper[4883]: I0310 09:04:02.745338 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 09:04:02 crc kubenswrapper[4883]: E0310 09:04:02.749515 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 09:04:03 crc kubenswrapper[4883]: I0310 09:04:03.050538 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:04 crc kubenswrapper[4883]: I0310 09:04:04.044341 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:04 crc kubenswrapper[4883]: E0310 09:04:04.134655 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 09:04:05 crc kubenswrapper[4883]: I0310 09:04:05.041387 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:06 crc kubenswrapper[4883]: I0310 09:04:06.044197 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:07 crc kubenswrapper[4883]: I0310 09:04:07.044318 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:08 crc kubenswrapper[4883]: I0310 09:04:08.044825 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 09:04:09.043867 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:09 crc kubenswrapper[4883]: E0310 09:04:09.739990 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 09:04:09.750225 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 09:04:09.751390 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 09:04:09.751435 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 09:04:09.751449 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:09 crc kubenswrapper[4883]: I0310 09:04:09.751498 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 09:04:09 crc kubenswrapper[4883]: E0310 09:04:09.754554 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 09:04:10 crc kubenswrapper[4883]: I0310 09:04:10.043748 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.043362 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.079868 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.080787 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.080815 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.080825 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.081250 4883 scope.go:117] "RemoveContainer" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f"
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.214034 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.215352 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13"}
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.215463 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.216089 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.216109 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:11 crc kubenswrapper[4883]: I0310 09:04:11.216117 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.043726 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.218704 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.219189 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.220804 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13" exitCode=255
Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.220839 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13"}
Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.220871 4883 scope.go:117] "RemoveContainer" containerID="c7b4f664df3f11468d9d8594d164f3449f4a5c330b0a3cfc966a0a710814ac2f"
Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.221075 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.226350 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.226396 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.226407 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:12 crc kubenswrapper[4883]: I0310 09:04:12.227043 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13"
Mar 10 09:04:12 crc kubenswrapper[4883]: E0310 09:04:12.227915 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 09:04:13 crc kubenswrapper[4883]: I0310 09:04:13.041777 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:13 crc kubenswrapper[4883]: I0310 09:04:13.225906 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 10 09:04:14 crc kubenswrapper[4883]: I0310 09:04:14.043654 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:14 crc kubenswrapper[4883]: E0310 09:04:14.134783 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 09:04:15 crc kubenswrapper[4883]: I0310 09:04:15.043531 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:15 crc kubenswrapper[4883]: W0310 09:04:15.691990 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 10 09:04:15 crc kubenswrapper[4883]: E0310 09:04:15.692042 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 10 09:04:15 crc kubenswrapper[4883]: W0310 09:04:15.780808 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 10 09:04:15 crc kubenswrapper[4883]: E0310 09:04:15.780839 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.043831 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:16 crc kubenswrapper[4883]: E0310 09:04:16.743543 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.755680 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.756859 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.756903 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.756914 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:16 crc kubenswrapper[4883]: I0310 09:04:16.756940 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 09:04:16 crc kubenswrapper[4883]: E0310 09:04:16.760057 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.043612 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.373177 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.373366 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.374253 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.374285 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.374295 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:17 crc kubenswrapper[4883]: I0310 09:04:17.374714 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13"
Mar 10 09:04:17 crc kubenswrapper[4883]: E0310 09:04:17.374863 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 09:04:17 crc kubenswrapper[4883]: W0310 09:04:17.684528 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:17 crc kubenswrapper[4883]: E0310 09:04:17.684575 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.043652 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.514503 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.514651 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.515778 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.515819 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.515831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:18 crc kubenswrapper[4883]: I0310 09:04:18.516383 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13"
Mar 10 09:04:18 crc kubenswrapper[4883]: E0310 09:04:18.516594 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 09:04:19 crc kubenswrapper[4883]: I0310 09:04:19.047680 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:20 crc kubenswrapper[4883]: I0310 09:04:20.044408 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:21 crc kubenswrapper[4883]: I0310 09:04:21.044177 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:21 crc kubenswrapper[4883]: W0310 09:04:21.518528 4883 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 10 09:04:21 crc kubenswrapper[4883]: E0310 09:04:21.518579 4883 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 10 09:04:22 crc kubenswrapper[4883]: I0310 09:04:22.043579 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.043314 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:23 crc kubenswrapper[4883]: E0310 09:04:23.746448 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.760694 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.761734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.761775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.761786 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:23 crc kubenswrapper[4883]: I0310 09:04:23.761811 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 09:04:23 crc kubenswrapper[4883]: E0310 09:04:23.765032 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 09:04:24 crc kubenswrapper[4883]: I0310 09:04:24.044317 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:24 crc kubenswrapper[4883]: E0310 09:04:24.135308 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 09:04:25 crc kubenswrapper[4883]: I0310 09:04:25.043764 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:26 crc kubenswrapper[4883]: I0310 09:04:26.044161 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.043916 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.700130 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.700333 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.701332 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.701362 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:27 crc kubenswrapper[4883]: I0310 09:04:27.701396 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:28 crc kubenswrapper[4883]: I0310 09:04:28.043998 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:29 crc kubenswrapper[4883]: I0310 09:04:29.046692 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.043927 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:30 crc kubenswrapper[4883]: E0310 09:04:30.749995 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.766084 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.767043 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.767076 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.767088 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:30 crc kubenswrapper[4883]: I0310 09:04:30.767111 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 09:04:30 crc kubenswrapper[4883]: E0310 09:04:30.770727 4883 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.043297 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.079119 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.079988 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.080016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.080024 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:31 crc kubenswrapper[4883]: I0310 09:04:31.080397 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13"
Mar 10 09:04:31 crc kubenswrapper[4883]: E0310 09:04:31.080578 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 09:04:32 crc kubenswrapper[4883]: I0310 09:04:32.044033 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:33 crc kubenswrapper[4883]: I0310 09:04:33.044229 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:34 crc kubenswrapper[4883]: I0310 09:04:34.043822 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:34 crc kubenswrapper[4883]: E0310 09:04:34.136340 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 09:04:35 crc kubenswrapper[4883]: I0310 09:04:35.044237 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:36 crc kubenswrapper[4883]: I0310 09:04:36.044458 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.043623 4883 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.402326 4883 csr.go:261] certificate signing request csr-j5rsq is approved, waiting to be issued
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.408033 4883 csr.go:257] certificate signing request csr-j5rsq is issued
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.477880 4883 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.771282 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.773029 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.773077 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.773088 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.773251 4883 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.779982 4883 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.780244 4883 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.780266 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.783144 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.783182 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.783193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.783213 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.783260 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:37Z","lastTransitionTime":"2026-03-10T09:04:37Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.793682 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a
-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.799237 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.799268 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.799279 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.799297 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.799307 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:37Z","lastTransitionTime":"2026-03-10T09:04:37Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.806074 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a
-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.811617 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.811643 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.811652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.811663 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.811670 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:37Z","lastTransitionTime":"2026-03-10T09:04:37Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.818908 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a
-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.823826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.823853 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.823862 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.823879 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.823889 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:37Z","lastTransitionTime":"2026-03-10T09:04:37Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.832073 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:37Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a
-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.832185 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.832214 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:37 crc kubenswrapper[4883]: E0310 09:04:37.932688 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:37 crc kubenswrapper[4883]: I0310 09:04:37.963296 4883 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.033280 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.133773 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.234752 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.335196 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: I0310 09:04:38.409406 4883 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-13 02:37:17.612067211 +0000 UTC Mar 10 09:04:38 crc kubenswrapper[4883]: I0310 09:04:38.409462 4883 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Waiting 6665h32m39.20260855s for next certificate rotation Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.435821 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.536520 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.637078 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.737880 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.838636 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:38 crc kubenswrapper[4883]: E0310 09:04:38.939407 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.039979 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.140752 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.241279 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.341667 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.442790 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc 
kubenswrapper[4883]: E0310 09:04:39.542870 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.643591 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.744393 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.845227 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:39 crc kubenswrapper[4883]: E0310 09:04:39.946101 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.046625 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.147610 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.248039 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.349033 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.449954 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.551068 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.651938 4883 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found"
Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.752452 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.853426 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:40 crc kubenswrapper[4883]: E0310 09:04:40.954262 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.054970 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.155387 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.256141 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.356994 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.457931 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.558247 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.659082 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.759633 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.860740 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:41 crc kubenswrapper[4883]: E0310 09:04:41.960910 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.061383 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.162274 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.263405 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.363810 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.464527 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.565117 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.665526 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.766218 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.867138 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:42 crc kubenswrapper[4883]: E0310 09:04:42.968133 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.068756 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.169166 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.269594 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.370665 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.471566 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.572503 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.673396 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.774186 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.874825 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:43 crc kubenswrapper[4883]: E0310 09:04:43.975881 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.076882 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.136613 4883 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.177789 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.278626 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.379411 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.480213 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.580275 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.681423 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.781925 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.883024 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:44 crc kubenswrapper[4883]: E0310 09:04:44.983314 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.079164 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.080363 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.080399 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.080409 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.081143 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13"
Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.083722 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.184625 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.285611 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.307350 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.308994 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a"}
Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.309207 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.310144 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.310177 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:45 crc kubenswrapper[4883]: I0310 09:04:45.310188 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.385956 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.486847 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.587939 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.688724 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.789833 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.890508 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:45 crc kubenswrapper[4883]: E0310 09:04:45.990570 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:46 crc kubenswrapper[4883]: E0310 09:04:46.091433 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:46 crc kubenswrapper[4883]: E0310 09:04:46.192346 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:46 crc kubenswrapper[4883]: E0310 09:04:46.292988 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.313111 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.313611 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.315233 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" exitCode=255
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.315284 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a"}
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.315345 4883 scope.go:117] "RemoveContainer" containerID="a219fce9b78eb618b986508c4c67b30c618f0ead336b77d1e89589a34a2d4f13"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.315498 4883 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.316490 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.316522 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.316531 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.317137 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a"
Mar 10 09:04:46 crc kubenswrapper[4883]: E0310 09:04:46.317301 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 09:04:46 crc kubenswrapper[4883]: E0310 09:04:46.393526 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:46 crc kubenswrapper[4883]: E0310 09:04:46.493639 4883 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.576001 4883 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.595969 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.596012 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.596023 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.596041 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.596053 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:46Z","lastTransitionTime":"2026-03-10T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.698156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.698183 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.698193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.698210 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.698223 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:46Z","lastTransitionTime":"2026-03-10T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.800115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.800151 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.800160 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.800181 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.800190 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:46Z","lastTransitionTime":"2026-03-10T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.902255 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.902286 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.902319 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.902338 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:46 crc kubenswrapper[4883]: I0310 09:04:46.902348 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:46Z","lastTransitionTime":"2026-03-10T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.003939 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.003977 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.003986 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.003996 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.004004 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.071161 4883 apiserver.go:52] "Watching apiserver"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.077009 4883 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.078425 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7xb47","openshift-network-diagnostics/network-check-target-xd92c","openshift-multus/multus-p898z","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-ovn-kubernetes/ovnkube-node-pzdml","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9","openshift-image-registry/node-ca-vvbjw","openshift-machine-config-operator/machine-config-daemon-zxzn8","openshift-multus/multus-additional-cni-plugins-nrzgf","openshift-multus/network-metrics-daemon-gmq5n","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.078894 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.079040 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.079094 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.079229 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.079217 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.080054 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.080330 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.080609 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.080620 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.080747 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nrzgf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.081788 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.081915 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vvbjw"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.081946 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.081985 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7xb47"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.082022 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.082053 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p898z"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.082313 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8"
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.082688 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.082820 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.084718 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.084884 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.085200 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.085418 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.085507 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.085598 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.085606 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086022 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086316 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086374 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086560 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086583 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086727 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.086972 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087090 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087185 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087365 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087511 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087527 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087553 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087602 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087607 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087624 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087633 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087638 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087733 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087748 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087752 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087764 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087751 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087813 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087847 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.087848 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.088069 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.088132 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.088243 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.099598 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.108888 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.109016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.109076 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.109146 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.109216 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.116348 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.122978 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status:
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.132605 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.134780 4883 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.141296 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.144284 4883 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.149082 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.161579 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.171327 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.181442 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.192257 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.199831 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.210188 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.211794 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.211819 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.211828 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.211841 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.211850 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.222540 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.230837 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.238551 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.239763 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.239867 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.239945 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240037 4883 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240118 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240199 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240264 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240327 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240391 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240119 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240454 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240489 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240272 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240299 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240138 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240660 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240713 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240712 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240765 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240788 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.240947 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241023 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241084 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241100 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241221 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241284 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241345 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241407 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241467 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241577 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241178 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241224 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241539 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241600 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241703 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241708 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241757 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241764 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241796 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241896 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.241928 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242009 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242081 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242159 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242228 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242300 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242365 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242437 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242540 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242384 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242551 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242565 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242663 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242597 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242621 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242727 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242739 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242765 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242789 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242809 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242823 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242839 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242857 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242873 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242889 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242904 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242922 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242937 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242952 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242981 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242989 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.242998 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243012 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243018 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243032 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243049 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243066 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243080 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243105 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243121 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243138 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243154 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243168 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243186 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243192 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243202 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243202 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243245 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243266 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243285 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243303 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243320 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243341 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243344 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243360 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243379 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243396 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243412 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243435 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243452 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243468 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243505 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243522 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243541 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243562 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243577 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243595 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243597 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243612 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243617 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243628 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243646 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243663 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243680 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243696 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243713 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243729 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243746 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243763 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243767 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243779 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243796 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243811 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310
09:04:47.243827 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243845 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243861 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243877 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243894 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243910 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243920 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243927 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.243924 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244058 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244083 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244103 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244119 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244135 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244278 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244376 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244391 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244642 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244716 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244748 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244766 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244782 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: 
I0310 09:04:47.244797 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244814 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244828 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244843 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244860 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244878 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244898 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244914 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244943 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244967 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244982 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.244999 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245012 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245026 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245041 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245056 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 
09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245070 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245085 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245100 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245116 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245130 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245143 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245157 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245170 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245924 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245948 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245974 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 09:04:47 crc 
kubenswrapper[4883]: I0310 09:04:47.245988 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246006 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246023 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246040 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246059 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246078 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246098 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246168 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246187 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246207 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246228 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246244 4883 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246277 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246293 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246308 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246327 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246341 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246359 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246376 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246391 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246412 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246427 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249391 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245236 4883 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245383 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245503 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245594 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245823 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.245853 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246428 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.246447 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:04:47.746430164 +0000 UTC m=+74.001328052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250165 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250196 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250217 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250235 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250253 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250278 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250296 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250311 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250432 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250498 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250531 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250578 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250597 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250616 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 09:04:47 crc 
kubenswrapper[4883]: I0310 09:04:47.250631 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250652 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250669 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250685 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250702 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250719 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250737 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250758 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250774 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250791 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250809 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 09:04:47 crc 
kubenswrapper[4883]: I0310 09:04:47.250825 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250842 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250864 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250879 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250896 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250912 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250943 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250972 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250988 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251006 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251022 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251039 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251056 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251072 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251089 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251108 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 
09:04:47.251182 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251208 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdjf\" (UniqueName: \"kubernetes.io/projected/6c845e62-37a1-473c-a4d0-a354594903bc-kube-api-access-9zdjf\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251231 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251249 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99873383-15b6-42ee-a65f-7917294d2e02-mcd-auth-proxy-config\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251269 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-cnibin\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: 
\"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251285 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-os-release\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251301 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251320 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251339 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251355 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251371 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251388 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251403 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99873383-15b6-42ee-a65f-7917294d2e02-proxy-tls\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251423 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251438 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251453 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h98t5\" (UniqueName: \"kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251470 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-netns\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251500 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53ffac75-0989-4945-915d-4aacec270cdb-host\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251515 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251537 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251557 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251573 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsr4f\" (UniqueName: \"kubernetes.io/projected/53ffac75-0989-4945-915d-4aacec270cdb-kube-api-access-qsr4f\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251589 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251626 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-multus-certs\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 
crc kubenswrapper[4883]: I0310 09:04:47.251643 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-system-cni-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251658 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251674 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251696 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251712 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch\") 
pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251727 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251745 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-hostroot\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251760 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-multus-daemon-config\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251782 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251800 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251817 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251835 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-cnibin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251852 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-kubelet\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251869 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251887 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-os-release\") pod \"multus-p898z\" 
(UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251904 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-socket-dir-parent\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251922 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251937 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251953 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-hosts-file\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252181 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64vnf\" (UniqueName: \"kubernetes.io/projected/bd6597a3-f861-4126-933e-d6134c8bd4b5-kube-api-access-64vnf\") pod 
\"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252215 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252257 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252275 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-conf-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252292 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53ffac75-0989-4945-915d-4aacec270cdb-serviceca\") pod 
\"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252332 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252349 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252367 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-system-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252383 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252401 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252417 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqn66\" (UniqueName: \"kubernetes.io/projected/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-kube-api-access-fqn66\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252433 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99873383-15b6-42ee-a65f-7917294d2e02-rootfs\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252450 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252464 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252492 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252509 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-etc-kubernetes\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252526 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58nsm\" (UniqueName: \"kubernetes.io/projected/99873383-15b6-42ee-a65f-7917294d2e02-kube-api-access-58nsm\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252543 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5fd36c79-e84e-49aa-97b9-616563193cd2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252559 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-bin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " 
pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252573 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-multus\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252590 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252604 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-k8s-cni-cncf-io\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252622 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2lkr\" (UniqueName: \"kubernetes.io/projected/5fd36c79-e84e-49aa-97b9-616563193cd2-kube-api-access-v2lkr\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252638 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-cni-binary-copy\") pod \"multus-p898z\" 
(UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252655 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wszn\" (UniqueName: \"kubernetes.io/projected/8e883c29-520e-4b1f-b49c-3df10450d467-kube-api-access-2wszn\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252675 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252693 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252708 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252723 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252824 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252837 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252847 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252856 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252866 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252898 4883 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252908 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252918 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252929 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252938 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252948 4883 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252967 4883 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252976 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252985 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 
09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252995 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253007 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253017 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253027 4883 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253037 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253048 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253059 4883 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 
09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253070 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253079 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253089 4883 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253102 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253111 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253120 4883 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253130 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 
09:04:47.253139 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253149 4883 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253157 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253166 4883 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253176 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253184 4883 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253194 4883 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253203 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253212 4883 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253221 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253230 4883 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253240 4883 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253249 4883 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253259 4883 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253268 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253278 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254395 4883 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255695 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.257793 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.258014 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250861 4883 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250874 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246547 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246843 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246942 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246942 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246820 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247353 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247432 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247438 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247522 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247534 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247697 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247808 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.247839 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248117 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248145 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248357 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248413 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248501 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248724 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248715 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248572 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.248868 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249059 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249070 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249718 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249847 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249887 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249906 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.249917 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250066 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.250900 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251108 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251320 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251502 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251535 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251648 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251683 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251720 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251885 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.251928 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252159 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252270 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.259500 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252284 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.252963 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253229 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253358 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253378 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253512 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253547 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253743 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253758 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253882 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253916 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.253944 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254061 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254139 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254260 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254280 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254607 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254759 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254849 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254880 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.254904 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.255020 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.259743 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 09:04:47.7597138 +0000 UTC m=+74.014611690 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255289 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255330 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255510 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255569 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.255366 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.256530 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.259845 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:47.75982446 +0000 UTC m=+74.014722349 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.259924 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.246537 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.257584 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.257503 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.257297 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.258336 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.258464 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.258757 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.259327 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.259359 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.257681 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260088 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260217 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260234 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260326 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260342 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.260802 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.262428 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.264806 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.264828 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.264839 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.264861 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.264874 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:47.764865249 +0000 UTC m=+74.019763138 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.265172 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.265177 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.265222 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.265232 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.265260 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:47.765252312 +0000 UTC m=+74.020150191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.265363 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.265596 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.265885 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.266017 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.266712 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.266891 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.267857 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.267916 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.267997 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268071 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268080 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268236 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.269286 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268279 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268456 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268604 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268767 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268777 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268845 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268878 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275364 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.269860 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268236 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.268885 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.269014 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.269065 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.269340 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.270574 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.270671 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.270867 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275753 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275886 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275930 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275952 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.275983 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.276122 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.276182 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.276377 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.277546 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.277869 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278018 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278142 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278281 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278399 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278420 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.278686 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.279035 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.280230 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.280259 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.280263 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.280324 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.283051 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.288492 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.290691 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.297334 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.298797 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.315281 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.315317 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.315331 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.315351 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.315363 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.319323 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353626 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99873383-15b6-42ee-a65f-7917294d2e02-rootfs\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353665 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353687 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353694 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/99873383-15b6-42ee-a65f-7917294d2e02-rootfs\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353749 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-etc-kubernetes\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353710 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-etc-kubernetes\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353838 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353752 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353885 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqn66\" (UniqueName: \"kubernetes.io/projected/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-kube-api-access-fqn66\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353918 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58nsm\" (UniqueName: \"kubernetes.io/projected/99873383-15b6-42ee-a65f-7917294d2e02-kube-api-access-58nsm\") 
pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353937 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.353953 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5fd36c79-e84e-49aa-97b9-616563193cd2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354010 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354035 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-k8s-cni-cncf-io\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354058 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-bin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354081 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-multus\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354123 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354147 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354148 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-k8s-cni-cncf-io\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354174 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2lkr\" (UniqueName: 
\"kubernetes.io/projected/5fd36c79-e84e-49aa-97b9-616563193cd2-kube-api-access-v2lkr\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354199 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-cni-binary-copy\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354223 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wszn\" (UniqueName: \"kubernetes.io/projected/8e883c29-520e-4b1f-b49c-3df10450d467-kube-api-access-2wszn\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354246 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99873383-15b6-42ee-a65f-7917294d2e02-mcd-auth-proxy-config\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354265 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354271 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-cnibin\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354319 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-cnibin\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354323 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-os-release\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354368 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-os-release\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354376 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354408 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-bin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354412 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354452 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdjf\" (UniqueName: \"kubernetes.io/projected/6c845e62-37a1-473c-a4d0-a354594903bc-kube-api-access-9zdjf\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354518 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354539 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354561 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354605 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354613 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354625 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99873383-15b6-42ee-a65f-7917294d2e02-proxy-tls\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354766 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/53ffac75-0989-4945-915d-4aacec270cdb-host\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354792 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354822 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354835 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53ffac75-0989-4945-915d-4aacec270cdb-host\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354842 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h98t5\" (UniqueName: \"kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354891 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash\") pod \"ovnkube-node-pzdml\" (UID: 
\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354900 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-netns\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354923 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsr4f\" (UniqueName: \"kubernetes.io/projected/53ffac75-0989-4945-915d-4aacec270cdb-kube-api-access-qsr4f\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354944 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354976 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-multus-certs\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.354999 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-system-cni-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " 
pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355020 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355041 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355041 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355064 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355078 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355086 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355104 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-netns\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355107 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355129 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355150 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-cnibin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 
09:04:47.355166 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-hostroot\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355181 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-multus-daemon-config\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355210 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355225 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-os-release\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355244 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-socket-dir-parent\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355252 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-run-multus-certs\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355260 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-kubelet\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355279 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-system-cni-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355283 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355306 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355312 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log\") pod \"ovnkube-node-pzdml\" 
(UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355330 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355346 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355352 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-hosts-file\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355355 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355370 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64vnf\" (UniqueName: \"kubernetes.io/projected/bd6597a3-f861-4126-933e-d6134c8bd4b5-kube-api-access-64vnf\") pod \"network-metrics-daemon-gmq5n\" (UID: 
\"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355384 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-cnibin\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355391 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53ffac75-0989-4945-915d-4aacec270cdb-serviceca\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355413 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355430 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355447 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-system-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355455 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-hostroot\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355464 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-conf-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355593 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355604 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355614 4883 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355623 4883 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355632 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc 
kubenswrapper[4883]: I0310 09:04:47.355641 4883 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355650 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355658 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355668 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355678 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355687 4883 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355696 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355708 4883 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355717 4883 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355726 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355735 4883 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355745 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355754 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355763 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355765 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355793 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-conf-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355331 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355772 4883 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355806 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c845e62-37a1-473c-a4d0-a354594903bc-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355817 4883 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node 
\"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355847 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-socket-dir-parent\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355866 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355874 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-kubelet\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355881 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355894 4883 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355892 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 
crc kubenswrapper[4883]: I0310 09:04:47.355898 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355920 4883 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355938 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355952 4883 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355974 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355975 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355986 4883 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355998 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356000 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-multus-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356009 4883 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356025 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-hosts-file\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.354867 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356046 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-system-cni-dir\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: 
I0310 09:04:47.355430 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-os-release\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.355908 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356075 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356195 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e883c29-520e-4b1f-b49c-3df10450d467-host-var-lib-cni-multus\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.356220 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:47.856187862 +0000 UTC m=+74.111085741 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356232 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356288 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356629 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5fd36c79-e84e-49aa-97b9-616563193cd2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356788 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356807 4883 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356826 4883 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356840 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356852 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356869 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356882 4883 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356894 4883 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356906 4883 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356920 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356935 4883 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356950 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356974 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.356989 4883 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357003 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357018 4883 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath 
\"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357028 4883 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357041 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357054 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357038 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-multus-daemon-config\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357056 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357027 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357073 4883 reconciler_common.go:293] 
"Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357184 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357202 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357215 4883 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357227 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357241 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357255 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357265 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357277 4883 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357291 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357312 4883 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357323 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357337 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357348 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357360 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357373 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357391 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357402 4883 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357412 4883 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357423 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357433 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357441 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 
09:04:47.357452 4883 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357461 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357488 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357499 4883 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357511 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357522 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357531 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357541 4883 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357551 4883 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357559 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357569 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357579 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357588 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357630 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357640 4883 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 
crc kubenswrapper[4883]: I0310 09:04:47.357650 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357659 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357667 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357677 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357682 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c845e62-37a1-473c-a4d0-a354594903bc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357688 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.357884 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/99873383-15b6-42ee-a65f-7917294d2e02-mcd-auth-proxy-config\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358275 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e883c29-520e-4b1f-b49c-3df10450d467-cni-binary-copy\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358580 4883 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358605 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358618 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358631 4883 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358642 4883 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358653 4883 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358664 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358675 4883 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358685 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358696 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358706 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358719 4883 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358730 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358745 4883 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358757 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358769 4883 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358783 4883 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358779 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5fd36c79-e84e-49aa-97b9-616563193cd2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358793 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358806 4883 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358818 4883 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358831 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358841 4883 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358851 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358864 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358874 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358884 4883 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358894 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358896 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358907 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358919 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358930 4883 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358941 4883 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358951 4883 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358970 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358982 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.358992 4883 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359002 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359012 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359022 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359032 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 
crc kubenswrapper[4883]: I0310 09:04:47.359043 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359052 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359061 4883 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359070 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359082 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359091 4883 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359102 4883 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359113 4883 reconciler_common.go:293] "Volume detached for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359122 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359132 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359143 4883 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359152 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359154 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53ffac75-0989-4945-915d-4aacec270cdb-serviceca\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359161 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359203 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359222 4883 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359232 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359243 4883 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359254 4883 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.359266 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.366578 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99873383-15b6-42ee-a65f-7917294d2e02-proxy-tls\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.367218 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.369012 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdjf\" (UniqueName: \"kubernetes.io/projected/6c845e62-37a1-473c-a4d0-a354594903bc-kube-api-access-9zdjf\") pod \"multus-additional-cni-plugins-nrzgf\" (UID: \"6c845e62-37a1-473c-a4d0-a354594903bc\") " pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.370866 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqn66\" (UniqueName: \"kubernetes.io/projected/d41077f5-9f66-4be5-bb1a-e0f5b2b078e0-kube-api-access-fqn66\") pod \"node-resolver-7xb47\" (UID: \"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\") " pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.371571 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsr4f\" (UniqueName: \"kubernetes.io/projected/53ffac75-0989-4945-915d-4aacec270cdb-kube-api-access-qsr4f\") pod \"node-ca-vvbjw\" (UID: \"53ffac75-0989-4945-915d-4aacec270cdb\") " pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.371777 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h98t5\" (UniqueName: \"kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5\") pod \"ovnkube-node-pzdml\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.372042 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-v2lkr\" (UniqueName: \"kubernetes.io/projected/5fd36c79-e84e-49aa-97b9-616563193cd2-kube-api-access-v2lkr\") pod \"ovnkube-control-plane-749d76644c-x7sm9\" (UID: \"5fd36c79-e84e-49aa-97b9-616563193cd2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.372140 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64vnf\" (UniqueName: \"kubernetes.io/projected/bd6597a3-f861-4126-933e-d6134c8bd4b5-kube-api-access-64vnf\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.372490 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.373075 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wszn\" (UniqueName: \"kubernetes.io/projected/8e883c29-520e-4b1f-b49c-3df10450d467-kube-api-access-2wszn\") pod \"multus-p898z\" (UID: \"8e883c29-520e-4b1f-b49c-3df10450d467\") " pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.373816 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58nsm\" (UniqueName: \"kubernetes.io/projected/99873383-15b6-42ee-a65f-7917294d2e02-kube-api-access-58nsm\") pod \"machine-config-daemon-zxzn8\" (UID: \"99873383-15b6-42ee-a65f-7917294d2e02\") " pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.383259 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.383594 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.383798 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.383812 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.390081 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.392365 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.397359 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.398421 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.404605 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.407082 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.412116 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:47 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 10 09:04:47 crc kubenswrapper[4883]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 09:04:47 crc kubenswrapper[4883]: ho_enable="--enable-hybrid-overlay" Mar 10 09:04:47 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 09:04:47 crc kubenswrapper[4883]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 09:04:47 crc kubenswrapper[4883]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 09:04:47 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:04:47 crc kubenswrapper[4883]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --webhook-host=127.0.0.1 \ Mar 10 09:04:47 crc kubenswrapper[4883]: --webhook-port=9743 \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${ho_enable} \ Mar 10 09:04:47 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:04:47 crc kubenswrapper[4883]: --disable-approver \ Mar 10 09:04:47 crc kubenswrapper[4883]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --wait-for-kubernetes-api=200s \ Mar 10 09:04:47 crc kubenswrapper[4883]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.412213 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-5fd8779fcd7ff3888091c6737bdc03e284aa3701dba472f074512d524156462a WatchSource:0}: Error finding container 
5fd8779fcd7ff3888091c6737bdc03e284aa3701dba472f074512d524156462a: Status 404 returned error can't find the container with id 5fd8779fcd7ff3888091c6737bdc03e284aa3701dba472f074512d524156462a Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.414831 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:47 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 09:04:47 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:04:47 crc kubenswrapper[4883]: --disable-webhook \ Mar 10 09:04:47 crc kubenswrapper[4883]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.414933 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.416161 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.416564 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 09:04:47 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: source /etc/kubernetes/apiserver-url.env Mar 10 09:04:47 crc kubenswrapper[4883]: else Mar 10 09:04:47 crc kubenswrapper[4883]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 09:04:47 crc kubenswrapper[4883]: exit 1 Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 09:04:47 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.416762 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.416838 4883 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-f092ec2562855ff5d74a3df6d9eeaca7a2693347921d1cfef0225bbc53f8f421 WatchSource:0}: Error finding container f092ec2562855ff5d74a3df6d9eeaca7a2693347921d1cfef0225bbc53f8f421: Status 404 returned error can't find the container with id f092ec2562855ff5d74a3df6d9eeaca7a2693347921d1cfef0225bbc53f8f421 Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.417981 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.418010 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.418019 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.418037 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.418048 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.418198 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.421719 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.426541 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.427716 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.428783 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.431707 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vvbjw" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.432469 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 10 09:04:47 crc kubenswrapper[4883]: set -euo pipefail Mar 10 09:04:47 crc kubenswrapper[4883]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 10 09:04:47 crc kubenswrapper[4883]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 10 09:04:47 crc kubenswrapper[4883]: # As the secret mount is optional we must wait for the files to be present. Mar 10 09:04:47 crc kubenswrapper[4883]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 10 09:04:47 crc kubenswrapper[4883]: TS=$(date +%s) Mar 10 09:04:47 crc kubenswrapper[4883]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 10 09:04:47 crc kubenswrapper[4883]: HAS_LOGGED_INFO=0 Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: log_missing_certs(){ Mar 10 09:04:47 crc kubenswrapper[4883]: CUR_TS=$(date +%s) Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 10 09:04:47 crc kubenswrapper[4883]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 10 09:04:47 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 10 09:04:47 crc kubenswrapper[4883]: HAS_LOGGED_INFO=1 Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: } Mar 10 09:04:47 crc kubenswrapper[4883]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 10 09:04:47 crc kubenswrapper[4883]: log_missing_certs Mar 10 09:04:47 crc kubenswrapper[4883]: sleep 5 Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 10 09:04:47 crc kubenswrapper[4883]: exec /usr/bin/kube-rbac-proxy \ Mar 10 09:04:47 crc kubenswrapper[4883]: --logtostderr \ Mar 10 09:04:47 crc kubenswrapper[4883]: --secure-listen-address=:9108 \ Mar 10 09:04:47 crc kubenswrapper[4883]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 10 09:04:47 crc kubenswrapper[4883]: --upstream=http://127.0.0.1:29108/ \ Mar 10 09:04:47 crc kubenswrapper[4883]: --tls-private-key-file=${TLS_PK} \ Mar 10 09:04:47 crc kubenswrapper[4883]: --tls-cert-file=${TLS_CERT} Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.432991 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.435338 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:47 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v4_join_subnet_opt= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: ovn_v6_transit_switch_subnet_opt= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: 
ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "false" == "true" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: persistent_ips_enabled_flag= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: # This is needed so that converting clusters from GA to TP Mar 10 09:04:47 crc kubenswrapper[4883]: # will rollout control plane pods as well Mar 10 09:04:47 crc kubenswrapper[4883]: network_segmentation_enabled_flag= Mar 10 09:04:47 crc kubenswrapper[4883]: multi_network_enabled_flag= Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:04:47 crc kubenswrapper[4883]: multi_network_enabled_flag="--enable-multi-network" Mar 10 09:04:47 crc kubenswrapper[4883]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: Mar 10 09:04:47 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 10 09:04:47 crc kubenswrapper[4883]: exec /usr/bin/ovnkube \ Mar 10 09:04:47 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:04:47 crc kubenswrapper[4883]: --init-cluster-manager "${K8S_NODE}" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 10 09:04:47 crc kubenswrapper[4883]: 
--loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --metrics-bind-address "127.0.0.1:29108" \ Mar 10 09:04:47 crc kubenswrapper[4883]: --metrics-enable-pprof \ Mar 10 09:04:47 crc kubenswrapper[4883]: --metrics-enable-config-duration \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${ovn_v4_join_subnet_opt} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${ovn_v6_join_subnet_opt} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${dns_name_resolver_enabled_flag} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${persistent_ips_enabled_flag} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${multi_network_enabled_flag} \ Mar 10 09:04:47 crc kubenswrapper[4883]: ${network_segmentation_enabled_flag} Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.436864 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" podUID="5fd36c79-e84e-49aa-97b9-616563193cd2" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.438021 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7xb47" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.440268 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc928c48_1df8_4c31_986e_eba2aa7a1c0b.slice/crio-f3b20c822aee97242cfd38a934b97331876c32d69d19440940706ed8c35c887b WatchSource:0}: Error finding container f3b20c822aee97242cfd38a934b97331876c32d69d19440940706ed8c35c887b: Status 404 returned error can't find the container with id f3b20c822aee97242cfd38a934b97331876c32d69d19440940706ed8c35c887b Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.442175 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 10 09:04:47 crc kubenswrapper[4883]: apiVersion: v1 Mar 10 09:04:47 crc kubenswrapper[4883]: clusters: Mar 10 09:04:47 crc kubenswrapper[4883]: - cluster: Mar 10 09:04:47 crc kubenswrapper[4883]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 10 09:04:47 crc kubenswrapper[4883]: server: https://api-int.crc.testing:6443 Mar 10 09:04:47 crc kubenswrapper[4883]: name: default-cluster Mar 10 09:04:47 crc kubenswrapper[4883]: contexts: Mar 10 09:04:47 crc kubenswrapper[4883]: - context: Mar 10 09:04:47 crc kubenswrapper[4883]: cluster: default-cluster Mar 10 09:04:47 crc kubenswrapper[4883]: namespace: default Mar 10 09:04:47 crc kubenswrapper[4883]: user: default-auth Mar 10 09:04:47 crc kubenswrapper[4883]: name: default-context Mar 10 09:04:47 crc kubenswrapper[4883]: current-context: default-context Mar 10 09:04:47 crc kubenswrapper[4883]: kind: Config Mar 10 09:04:47 crc kubenswrapper[4883]: preferences: {} Mar 10 09:04:47 crc kubenswrapper[4883]: users: 
Mar 10 09:04:47 crc kubenswrapper[4883]: - name: default-auth Mar 10 09:04:47 crc kubenswrapper[4883]: user: Mar 10 09:04:47 crc kubenswrapper[4883]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:47 crc kubenswrapper[4883]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:47 crc kubenswrapper[4883]: EOF Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h98t5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.443381 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.444396 
4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p898z" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.444818 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.447285 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c845e62_37a1_473c_a4d0_a354594903bc.slice/crio-64d12c99c8f963e10751c69bc204dad6e6e3c7e21a25f2f3dda2ce8f8b5d812f WatchSource:0}: Error finding container 64d12c99c8f963e10751c69bc204dad6e6e3c7e21a25f2f3dda2ce8f8b5d812f: Status 404 returned error can't find the container with id 64d12c99c8f963e10751c69bc204dad6e6e3c7e21a25f2f3dda2ce8f8b5d812f Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.448139 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.448646 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53ffac75_0989_4945_915d_4aacec270cdb.slice/crio-78f55b0a6b2a69803ecc700c2616b78125740f0ea83c2e5c3fc997f67b2cc092 WatchSource:0}: Error finding container 78f55b0a6b2a69803ecc700c2616b78125740f0ea83c2e5c3fc997f67b2cc092: Status 404 returned error can't find the container with id 78f55b0a6b2a69803ecc700c2616b78125740f0ea83c2e5c3fc997f67b2cc092 Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.451621 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zdjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-nrzgf_openshift-multus(6c845e62-37a1-473c-a4d0-a354594903bc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.451671 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 
crc kubenswrapper[4883]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 10 09:04:47 crc kubenswrapper[4883]: while [ true ]; Mar 10 09:04:47 crc kubenswrapper[4883]: do Mar 10 09:04:47 crc kubenswrapper[4883]: for f in $(ls /tmp/serviceca); do Mar 10 09:04:47 crc kubenswrapper[4883]: echo $f Mar 10 09:04:47 crc kubenswrapper[4883]: ca_file_path="/tmp/serviceca/${f}" Mar 10 09:04:47 crc kubenswrapper[4883]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 10 09:04:47 crc kubenswrapper[4883]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 10 09:04:47 crc kubenswrapper[4883]: if [ -e "${reg_dir_path}" ]; then Mar 10 09:04:47 crc kubenswrapper[4883]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 10 09:04:47 crc kubenswrapper[4883]: else Mar 10 09:04:47 crc kubenswrapper[4883]: mkdir $reg_dir_path Mar 10 09:04:47 crc kubenswrapper[4883]: cp $ca_file_path $reg_dir_path/ca.crt Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: for d in $(ls /etc/docker/certs.d); do Mar 10 09:04:47 crc kubenswrapper[4883]: echo $d Mar 10 09:04:47 crc kubenswrapper[4883]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 10 09:04:47 crc kubenswrapper[4883]: reg_conf_path="/tmp/serviceca/${dp}" Mar 10 09:04:47 crc kubenswrapper[4883]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 10 09:04:47 crc kubenswrapper[4883]: rm -rf /etc/docker/certs.d/$d Mar 10 09:04:47 crc kubenswrapper[4883]: fi Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: sleep 60 & wait ${!} Mar 10 09:04:47 crc kubenswrapper[4883]: done Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsr4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-vvbjw_openshift-image-registry(53ffac75-0989-4945-915d-4aacec270cdb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.452905 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" podUID="6c845e62-37a1-473c-a4d0-a354594903bc" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.453412 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.453623 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-vvbjw" podUID="53ffac75-0989-4945-915d-4aacec270cdb" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.459780 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd41077f5_9f66_4be5_bb1a_e0f5b2b078e0.slice/crio-96f58adf2a741bc4cec321b2a3f65b1f75163f5b5e28eb2df89c426b3e6c8fe7 WatchSource:0}: Error finding container 96f58adf2a741bc4cec321b2a3f65b1f75163f5b5e28eb2df89c426b3e6c8fe7: Status 404 returned error can't find the container with id 96f58adf2a741bc4cec321b2a3f65b1f75163f5b5e28eb2df89c426b3e6c8fe7 Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.459766 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.460695 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e883c29_520e_4b1f_b49c_3df10450d467.slice/crio-61743c07d965f6f35c6e99cfd906e13f0cbea719515486ae16105f8f6e775f1a WatchSource:0}: Error finding container 61743c07d965f6f35c6e99cfd906e13f0cbea719515486ae16105f8f6e775f1a: Status 404 returned error can't find the container with id 61743c07d965f6f35c6e99cfd906e13f0cbea719515486ae16105f8f6e775f1a Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.461911 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc 
kubenswrapper[4883]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash
Mar 10 09:04:47 crc kubenswrapper[4883]: set -uo pipefail
Mar 10 09:04:47 crc kubenswrapper[4883]: 
Mar 10 09:04:47 crc kubenswrapper[4883]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM
Mar 10 09:04:47 crc kubenswrapper[4883]: 
Mar 10 09:04:47 crc kubenswrapper[4883]: OPENSHIFT_MARKER="openshift-generated-node-resolver"
Mar 10 09:04:47 crc kubenswrapper[4883]: HOSTS_FILE="/etc/hosts"
Mar 10 09:04:47 crc kubenswrapper[4883]: TEMP_FILE="/etc/hosts.tmp"
Mar 10 09:04:47 crc kubenswrapper[4883]: 
Mar 10 09:04:47 crc kubenswrapper[4883]: IFS=', ' read -r -a services <<< "${SERVICES}"
Mar 10 09:04:47 crc kubenswrapper[4883]: 
Mar 10 09:04:47 crc kubenswrapper[4883]: # Make a temporary file with the old hosts file's attributes.
Mar 10 09:04:47 crc kubenswrapper[4883]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then
Mar 10 09:04:47 crc kubenswrapper[4883]: echo "Failed to preserve hosts file. Exiting."
Mar 10 09:04:47 crc kubenswrapper[4883]: exit 1
Mar 10 09:04:47 crc kubenswrapper[4883]: fi
Mar 10 09:04:47 crc kubenswrapper[4883]: 
Mar 10 09:04:47 crc kubenswrapper[4883]: while true; do
Mar 10 09:04:47 crc kubenswrapper[4883]: declare -A svc_ips
Mar 10 09:04:47 crc kubenswrapper[4883]: for svc in "${services[@]}"; do
Mar 10 09:04:47 crc kubenswrapper[4883]: # Fetch service IP from cluster dns if present. We make several tries
Mar 10 09:04:47 crc kubenswrapper[4883]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones
Mar 10 09:04:47 crc kubenswrapper[4883]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not
Mar 10 09:04:47 crc kubenswrapper[4883]: # support UDP loadbalancers and require reaching DNS through TCP.
Mar 10 09:04:47 crc kubenswrapper[4883]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 10 09:04:47 crc kubenswrapper[4883]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 10 09:04:47 crc kubenswrapper[4883]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"'
Mar 10 09:04:47 crc kubenswrapper[4883]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"')
Mar 10 09:04:47 crc kubenswrapper[4883]: for i in ${!cmds[*]}
Mar 10 09:04:47 crc kubenswrapper[4883]: do
Mar 10 09:04:47 crc kubenswrapper[4883]: ips=($(eval "${cmds[i]}"))
Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then
Mar 10 09:04:47 crc kubenswrapper[4883]: svc_ips["${svc}"]="${ips[@]}"
Mar 10 09:04:47 crc kubenswrapper[4883]: break
Mar 10 09:04:47 crc kubenswrapper[4883]: fi
Mar 10 09:04:47 crc kubenswrapper[4883]: done
Mar 10 09:04:47 crc kubenswrapper[4883]: done
Mar 10 09:04:47 crc kubenswrapper[4883]: 
Mar 10 09:04:47 crc kubenswrapper[4883]: # Update /etc/hosts only if we get valid service IPs
Mar 10 09:04:47 crc kubenswrapper[4883]: # We will not update /etc/hosts when there is coredns service outage or api unavailability
Mar 10 09:04:47 crc kubenswrapper[4883]: # Stale entries could exist in /etc/hosts if the service is deleted
Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ -n "${svc_ips[*]-}" ]]; then
Mar 10 09:04:47 crc kubenswrapper[4883]: # Build a new hosts file from /etc/hosts with our custom entries filtered out
Mar 10 09:04:47 crc kubenswrapper[4883]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then
Mar 10 09:04:47 crc kubenswrapper[4883]: # Only continue rebuilding the hosts entries if its original content is preserved
Mar 10 09:04:47 crc kubenswrapper[4883]: sleep 60 & wait
Mar 10 09:04:47 crc kubenswrapper[4883]: continue
Mar 10 09:04:47 crc kubenswrapper[4883]: fi
Mar 10 09:04:47 crc kubenswrapper[4883]: 
Mar 10 09:04:47 crc kubenswrapper[4883]: # Append resolver entries for services
Mar 10 09:04:47 crc kubenswrapper[4883]: rc=0
Mar 10 09:04:47 crc kubenswrapper[4883]: for svc in "${!svc_ips[@]}"; do
Mar 10 09:04:47 crc kubenswrapper[4883]: for ip in ${svc_ips[${svc}]}; do
Mar 10 09:04:47 crc kubenswrapper[4883]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$?
Mar 10 09:04:47 crc kubenswrapper[4883]: done
Mar 10 09:04:47 crc kubenswrapper[4883]: done
Mar 10 09:04:47 crc kubenswrapper[4883]: if [[ $rc -ne 0 ]]; then
Mar 10 09:04:47 crc kubenswrapper[4883]: sleep 60 & wait
Mar 10 09:04:47 crc kubenswrapper[4883]: continue
Mar 10 09:04:47 crc kubenswrapper[4883]: fi
Mar 10 09:04:47 crc kubenswrapper[4883]: 
Mar 10 09:04:47 crc kubenswrapper[4883]: 
Mar 10 09:04:47 crc kubenswrapper[4883]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior
Mar 10 09:04:47 crc kubenswrapper[4883]: # Replace /etc/hosts with our modified version if needed
Mar 10 09:04:47 crc kubenswrapper[4883]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}"
Mar 10 09:04:47 crc kubenswrapper[4883]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn
Mar 10 09:04:47 crc kubenswrapper[4883]: fi
Mar 10 09:04:47 crc kubenswrapper[4883]: sleep 60 & wait
Mar 10 09:04:47 crc kubenswrapper[4883]: unset svc_ips
Mar 10 09:04:47 crc kubenswrapper[4883]: done
Mar 10 09:04:47 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqn66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-7xb47_openshift-dns(d41077f5-9f66-4be5-bb1a-e0f5b2b078e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError"
Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.463226 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-7xb47" 
podUID="d41077f5-9f66-4be5-bb1a-e0f5b2b078e0" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.466335 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:47 crc kubenswrapper[4883]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 10 09:04:47 crc kubenswrapper[4883]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 10 09:04:47 crc kubenswrapper[4883]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wszn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:47 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.466604 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: W0310 09:04:47.467932 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99873383_15b6_42ee_a65f_7917294d2e02.slice/crio-33a804d063d73483c9020989e32442f359166c9020493396af4fcc6f7f1b75a3 WatchSource:0}: Error finding container 33a804d063d73483c9020989e32442f359166c9020493396af4fcc6f7f1b75a3: Status 404 returned error can't find the container with id 33a804d063d73483c9020989e32442f359166c9020493396af4fcc6f7f1b75a3 Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.467928 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.469977 4883 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.471869 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.473114 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.473241 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.479942 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.487240 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.493864 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.520581 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.520623 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.520634 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.520651 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.520660 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.623780 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.623829 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.623840 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.623860 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.623873 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.726967 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.727020 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.727034 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.727058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.727071 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.763000 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.763104 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.763169 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.763142835 +0000 UTC m=+75.018040724 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.763219 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.763232 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.763277 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.763268733 +0000 UTC m=+75.018166622 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.763359 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.763390 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.763383039 +0000 UTC m=+75.018280928 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.829390 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.829431 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.829447 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.829466 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.829490 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.864222 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.864270 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.864294 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864423 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: 
E0310 09:04:47.864444 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864463 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864510 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.864467043 +0000 UTC m=+75.119364932 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864536 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864575 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.864566492 +0000 UTC m=+75.119464381 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864593 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864631 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864650 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:47 crc kubenswrapper[4883]: E0310 09:04:47.864727 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:48.864705024 +0000 UTC m=+75.119602923 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.933755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.933816 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.933830 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.933859 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:47 crc kubenswrapper[4883]: I0310 09:04:47.933873 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:47Z","lastTransitionTime":"2026-03-10T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.036267 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.036340 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.036351 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.036370 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.036384 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.083387 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.083946 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.085289 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.085893 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.086852 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.087361 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.087937 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.089062 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.089706 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.090587 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.091074 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.092089 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.092605 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.093121 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.094041 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.094611 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.095533 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.095958 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.096529 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.097564 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.098034 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.098986 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.099462 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.100338 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.100819 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.101412 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.102018 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.102458 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.103036 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.103512 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.103937 4883 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.104039 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.105210 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.105710 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.106137 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.107126 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.107721 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.108234 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.111197 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.111785 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.112532 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.113061 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.114579 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.115246 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.115784 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.116343 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.116891 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.117817 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.118290 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.118970 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.119416 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.120129 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.120672 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.121186 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.138401 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.138451 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc 
kubenswrapper[4883]: I0310 09:04:48.138463 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.138503 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.138514 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.172238 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.172269 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.172277 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.172291 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.172319 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.182084 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.184985 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.185030 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.185041 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.185055 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.185065 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.193343 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.196047 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.196076 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.196087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.196096 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.196106 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.204717 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.210112 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.210150 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.210160 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.210176 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.210187 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.217349 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.220588 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.220632 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.220648 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.220667 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.220680 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.228272 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.228408 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.240722 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.240769 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.240780 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.240793 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.240806 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.324556 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" event={"ID":"5fd36c79-e84e-49aa-97b9-616563193cd2","Type":"ContainerStarted","Data":"71e009dc87367e50e041c8e8374a3628343780d0090eaf365ffc8a14120a7616"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.325707 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerStarted","Data":"61743c07d965f6f35c6e99cfd906e13f0cbea719515486ae16105f8f6e775f1a"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.327520 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 10 09:04:48 crc kubenswrapper[4883]: set -euo pipefail Mar 10 09:04:48 crc kubenswrapper[4883]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 10 09:04:48 crc kubenswrapper[4883]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 10 09:04:48 crc kubenswrapper[4883]: # As the secret mount is optional we must wait for the files to be present. Mar 10 09:04:48 crc kubenswrapper[4883]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 10 09:04:48 crc kubenswrapper[4883]: TS=$(date +%s) Mar 10 09:04:48 crc kubenswrapper[4883]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 10 09:04:48 crc kubenswrapper[4883]: HAS_LOGGED_INFO=0 Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: log_missing_certs(){ Mar 10 09:04:48 crc kubenswrapper[4883]: CUR_TS=$(date +%s) Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 10 09:04:48 crc kubenswrapper[4883]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 10 09:04:48 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 10 09:04:48 crc kubenswrapper[4883]: HAS_LOGGED_INFO=1 Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: } Mar 10 09:04:48 crc kubenswrapper[4883]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 10 09:04:48 crc kubenswrapper[4883]: log_missing_certs Mar 10 09:04:48 crc kubenswrapper[4883]: sleep 5 Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 10 09:04:48 crc kubenswrapper[4883]: exec /usr/bin/kube-rbac-proxy \ Mar 10 09:04:48 crc kubenswrapper[4883]: --logtostderr \ Mar 10 09:04:48 crc kubenswrapper[4883]: --secure-listen-address=:9108 \ Mar 10 09:04:48 crc kubenswrapper[4883]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 10 09:04:48 crc kubenswrapper[4883]: --upstream=http://127.0.0.1:29108/ \ Mar 10 09:04:48 crc kubenswrapper[4883]: --tls-private-key-file=${TLS_PK} \ Mar 10 09:04:48 crc kubenswrapper[4883]: --tls-cert-file=${TLS_CERT} Mar 10 09:04:48 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.328368 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 10 09:04:48 crc kubenswrapper[4883]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 10 09:04:48 crc kubenswrapper[4883]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wszn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.328617 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5fd8779fcd7ff3888091c6737bdc03e284aa3701dba472f074512d524156462a"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.329504 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.329748 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:48 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v4_join_subnet_opt= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:48 crc 
kubenswrapper[4883]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v6_transit_switch_subnet_opt= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "false" == "true" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: persistent_ips_enabled_flag= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: # This is needed so that converting clusters from GA to TP Mar 10 09:04:48 crc kubenswrapper[4883]: # will rollout control plane pods as well Mar 10 09:04:48 crc kubenswrapper[4883]: network_segmentation_enabled_flag= Mar 10 
09:04:48 crc kubenswrapper[4883]: multi_network_enabled_flag= Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: multi_network_enabled_flag="--enable-multi-network" Mar 10 09:04:48 crc kubenswrapper[4883]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 10 09:04:48 crc kubenswrapper[4883]: exec /usr/bin/ovnkube \ Mar 10 09:04:48 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:04:48 crc kubenswrapper[4883]: --init-cluster-manager "${K8S_NODE}" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 10 09:04:48 crc kubenswrapper[4883]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --metrics-bind-address "127.0.0.1:29108" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --metrics-enable-pprof \ Mar 10 09:04:48 crc kubenswrapper[4883]: --metrics-enable-config-duration \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${ovn_v4_join_subnet_opt} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${ovn_v6_join_subnet_opt} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${dns_name_resolver_enabled_flag} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${persistent_ips_enabled_flag} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${multi_network_enabled_flag} \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${network_segmentation_enabled_flag} Mar 10 09:04:48 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.330203 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4b1c424b6315ca81459f7e78a4734f9c1c18842d33c51f3b8914a2bc431288d4"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.330868 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" podUID="5fd36c79-e84e-49aa-97b9-616563193cd2" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.331381 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 09:04:48 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: source /etc/kubernetes/apiserver-url.env Mar 10 09:04:48 crc kubenswrapper[4883]: else Mar 10 09:04:48 crc kubenswrapper[4883]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 09:04:48 crc kubenswrapper[4883]: exit 1 Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 09:04:48 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.331786 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"33a804d063d73483c9020989e32442f359166c9020493396af4fcc6f7f1b75a3"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.332124 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container 
&Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:48 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 09:04:48 crc kubenswrapper[4883]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 09:04:48 crc kubenswrapper[4883]: ho_enable="--enable-hybrid-overlay" Mar 10 09:04:48 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 09:04:48 crc kubenswrapper[4883]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 09:04:48 crc kubenswrapper[4883]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 09:04:48 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:04:48 crc kubenswrapper[4883]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --webhook-host=127.0.0.1 \ Mar 10 09:04:48 crc kubenswrapper[4883]: --webhook-port=9743 \ Mar 10 09:04:48 crc kubenswrapper[4883]: ${ho_enable} \ Mar 10 09:04:48 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:04:48 crc kubenswrapper[4883]: --disable-approver \ Mar 10 09:04:48 crc kubenswrapper[4883]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --wait-for-kubernetes-api=200s \ Mar 10 09:04:48 crc kubenswrapper[4883]: 
--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:04:48 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.332495 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.332732 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.332839 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vvbjw" event={"ID":"53ffac75-0989-4945-915d-4aacec270cdb","Type":"ContainerStarted","Data":"78f55b0a6b2a69803ecc700c2616b78125740f0ea83c2e5c3fc997f67b2cc092"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.333936 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f092ec2562855ff5d74a3df6d9eeaca7a2693347921d1cfef0225bbc53f8f421"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.333975 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container 
&Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: set -o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:04:48 crc kubenswrapper[4883]: set +o allexport Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 09:04:48 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:04:48 crc kubenswrapper[4883]: --disable-webhook \ Mar 10 09:04:48 crc kubenswrapper[4883]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 09:04:48 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:04:48 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.334190 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 10 09:04:48 crc kubenswrapper[4883]: while [ true ]; Mar 10 09:04:48 crc kubenswrapper[4883]: do Mar 10 09:04:48 crc 
kubenswrapper[4883]: for f in $(ls /tmp/serviceca); do Mar 10 09:04:48 crc kubenswrapper[4883]: echo $f Mar 10 09:04:48 crc kubenswrapper[4883]: ca_file_path="/tmp/serviceca/${f}" Mar 10 09:04:48 crc kubenswrapper[4883]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 10 09:04:48 crc kubenswrapper[4883]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 10 09:04:48 crc kubenswrapper[4883]: if [ -e "${reg_dir_path}" ]; then Mar 10 09:04:48 crc kubenswrapper[4883]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 10 09:04:48 crc kubenswrapper[4883]: else Mar 10 09:04:48 crc kubenswrapper[4883]: mkdir $reg_dir_path Mar 10 09:04:48 crc kubenswrapper[4883]: cp $ca_file_path $reg_dir_path/ca.crt Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: for d in $(ls /etc/docker/certs.d); do Mar 10 09:04:48 crc kubenswrapper[4883]: echo $d Mar 10 09:04:48 crc kubenswrapper[4883]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 10 09:04:48 crc kubenswrapper[4883]: reg_conf_path="/tmp/serviceca/${dp}" Mar 10 09:04:48 crc kubenswrapper[4883]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 10 09:04:48 crc kubenswrapper[4883]: rm -rf /etc/docker/certs.d/$d Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: sleep 60 & wait ${!} Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsr4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-vvbjw_openshift-image-registry(53ffac75-0989-4945-915d-4aacec270cdb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.334989 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerStarted","Data":"64d12c99c8f963e10751c69bc204dad6e6e3c7e21a25f2f3dda2ce8f8b5d812f"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.334990 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.335183 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.335239 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.335342 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-vvbjw" podUID="53ffac75-0989-4945-915d-4aacec270cdb" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.335677 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.335880 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zdjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-nrzgf_openshift-multus(6c845e62-37a1-473c-a4d0-a354594903bc): CreateContainerConfigError: 
services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.336649 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.336615 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.336896 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7xb47" event={"ID":"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0","Type":"ContainerStarted","Data":"96f58adf2a741bc4cec321b2a3f65b1f75163f5b5e28eb2df89c426b3e6c8fe7"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.336971 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" podUID="6c845e62-37a1-473c-a4d0-a354594903bc" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.338425 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" 
event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"f3b20c822aee97242cfd38a934b97331876c32d69d19440940706ed8c35c887b"} Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.340101 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 10 09:04:48 crc kubenswrapper[4883]: set -uo pipefail Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 10 09:04:48 crc kubenswrapper[4883]: HOSTS_FILE="/etc/hosts" Mar 10 09:04:48 crc kubenswrapper[4883]: TEMP_FILE="/etc/hosts.tmp" Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: # Make a temporary file with the old hosts file's attributes. Mar 10 09:04:48 crc kubenswrapper[4883]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 10 09:04:48 crc kubenswrapper[4883]: echo "Failed to preserve hosts file. Exiting." Mar 10 09:04:48 crc kubenswrapper[4883]: exit 1 Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: while true; do Mar 10 09:04:48 crc kubenswrapper[4883]: declare -A svc_ips Mar 10 09:04:48 crc kubenswrapper[4883]: for svc in "${services[@]}"; do Mar 10 09:04:48 crc kubenswrapper[4883]: # Fetch service IP from cluster dns if present. We make several tries Mar 10 09:04:48 crc kubenswrapper[4883]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 10 09:04:48 crc kubenswrapper[4883]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 10 09:04:48 crc kubenswrapper[4883]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 10 09:04:48 crc kubenswrapper[4883]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:04:48 crc kubenswrapper[4883]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:04:48 crc kubenswrapper[4883]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:04:48 crc kubenswrapper[4883]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 10 09:04:48 crc kubenswrapper[4883]: for i in ${!cmds[*]} Mar 10 09:04:48 crc kubenswrapper[4883]: do Mar 10 09:04:48 crc kubenswrapper[4883]: ips=($(eval "${cmds[i]}")) Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: svc_ips["${svc}"]="${ips[@]}" Mar 10 09:04:48 crc kubenswrapper[4883]: break Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: # Update /etc/hosts only if we get valid service IPs Mar 10 09:04:48 crc kubenswrapper[4883]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 10 09:04:48 crc kubenswrapper[4883]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 10 09:04:48 crc kubenswrapper[4883]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 10 09:04:48 crc kubenswrapper[4883]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 10 09:04:48 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:04:48 crc kubenswrapper[4883]: continue Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: # Append resolver entries for services Mar 10 09:04:48 crc kubenswrapper[4883]: rc=0 Mar 10 09:04:48 crc kubenswrapper[4883]: for svc in "${!svc_ips[@]}"; do Mar 10 09:04:48 crc kubenswrapper[4883]: for ip in ${svc_ips[${svc}]}; do Mar 10 09:04:48 crc kubenswrapper[4883]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: if [[ $rc -ne 0 ]]; then Mar 10 09:04:48 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:04:48 crc kubenswrapper[4883]: continue Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: Mar 10 09:04:48 crc kubenswrapper[4883]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 10 09:04:48 crc kubenswrapper[4883]: # Replace /etc/hosts with our modified version if needed Mar 10 09:04:48 crc kubenswrapper[4883]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 10 09:04:48 crc kubenswrapper[4883]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 10 09:04:48 crc kubenswrapper[4883]: fi Mar 10 09:04:48 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:04:48 crc kubenswrapper[4883]: unset svc_ips Mar 10 09:04:48 crc kubenswrapper[4883]: done Mar 10 09:04:48 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqn66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-7xb47_openshift-dns(d41077f5-9f66-4be5-bb1a-e0f5b2b078e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.340330 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.340754 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.342120 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-7xb47" podUID="d41077f5-9f66-4be5-bb1a-e0f5b2b078e0" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.342497 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:48 crc kubenswrapper[4883]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 10 09:04:48 crc kubenswrapper[4883]: apiVersion: v1 Mar 10 09:04:48 crc kubenswrapper[4883]: clusters: Mar 10 09:04:48 crc kubenswrapper[4883]: - cluster: Mar 10 09:04:48 crc kubenswrapper[4883]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 10 09:04:48 crc kubenswrapper[4883]: server: https://api-int.crc.testing:6443 Mar 10 09:04:48 crc kubenswrapper[4883]: name: default-cluster Mar 10 09:04:48 crc kubenswrapper[4883]: contexts: Mar 10 09:04:48 crc kubenswrapper[4883]: - context: Mar 10 09:04:48 crc kubenswrapper[4883]: cluster: default-cluster Mar 10 09:04:48 crc kubenswrapper[4883]: namespace: default Mar 10 09:04:48 crc kubenswrapper[4883]: user: default-auth Mar 10 09:04:48 crc kubenswrapper[4883]: name: default-context Mar 10 09:04:48 crc kubenswrapper[4883]: current-context: default-context Mar 10 09:04:48 crc kubenswrapper[4883]: kind: Config Mar 10 09:04:48 crc 
kubenswrapper[4883]: preferences: {} Mar 10 09:04:48 crc kubenswrapper[4883]: users: Mar 10 09:04:48 crc kubenswrapper[4883]: - name: default-auth Mar 10 09:04:48 crc kubenswrapper[4883]: user: Mar 10 09:04:48 crc kubenswrapper[4883]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:48 crc kubenswrapper[4883]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:48 crc kubenswrapper[4883]: EOF Mar 10 09:04:48 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h98t5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:48 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.343783 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" 
podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.344982 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.345039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.345065 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.345079 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.345096 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.347675 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.355851 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.367734 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.377506 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.383677 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.391245 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.401754 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.414565 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.422048 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.430509 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.438842 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.445753 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.447798 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.447831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.447841 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.447875 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.447889 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.452514 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.460211 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.467338 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.473147 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.481915 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.488038 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.496225 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.505229 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.511504 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.513729 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.519795 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.526721 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.534180 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.541142 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.550467 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.550523 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.550534 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.550550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.550580 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.553716 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.561273 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.569490 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.577309 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.653095 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.653155 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.653167 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.653191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.653205 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.755670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.755716 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.755726 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.755747 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.755757 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.773546 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.773710 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 09:04:50.773690134 +0000 UTC m=+77.028588023 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.773783 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.773813 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.773950 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.774007 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 09:04:50.774000781 +0000 UTC m=+77.028898670 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.773952 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.774136 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:50.774118174 +0000 UTC m=+77.029016052 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.858416 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.858453 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.858468 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.858504 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.858520 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.874955 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.875032 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.875062 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875123 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875215 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:50.875194584 +0000 UTC m=+77.130092473 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875224 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875248 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875262 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875269 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875293 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875309 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875331 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:50.875315012 +0000 UTC m=+77.130212911 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:48 crc kubenswrapper[4883]: E0310 09:04:48.875378 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:50.875357562 +0000 UTC m=+77.130255441 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.962553 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.962593 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.962603 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.962618 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:48 crc kubenswrapper[4883]: I0310 09:04:48.962627 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:48Z","lastTransitionTime":"2026-03-10T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.064858 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.064892 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.064901 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.064919 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.064929 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.079946 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.080092 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:49 crc kubenswrapper[4883]: E0310 09:04:49.080222 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.080279 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:49 crc kubenswrapper[4883]: E0310 09:04:49.080432 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.080501 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:49 crc kubenswrapper[4883]: E0310 09:04:49.080756 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:49 crc kubenswrapper[4883]: E0310 09:04:49.080799 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.166743 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.166800 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.166811 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.166825 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.166835 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.269823 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.269874 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.269884 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.269895 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.269903 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.341586 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:04:49 crc kubenswrapper[4883]: E0310 09:04:49.342062 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.371999 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.372035 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.372047 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.372065 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.372080 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.474666 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.474723 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.474736 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.474755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.474767 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.577506 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.577550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.577561 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.577577 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.577592 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.679370 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.679407 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.679418 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.679431 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.679442 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.781191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.781239 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.781249 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.781262 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.781272 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.883575 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.883610 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.883621 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.883636 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.883646 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.922502 4883 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.985775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.985801 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.985809 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.985823 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:49 crc kubenswrapper[4883]: I0310 09:04:49.985833 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:49Z","lastTransitionTime":"2026-03-10T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.087696 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.087733 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.087744 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.087759 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.087770 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.189918 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.189948 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.189957 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.189968 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.190008 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.292032 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.292077 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.292086 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.292106 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.292117 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.394515 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.394547 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.394559 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.394572 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.394581 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.496111 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.496140 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.496150 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.496162 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.496172 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.598441 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.598508 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.598519 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.598536 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.598548 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.700447 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.700498 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.700508 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.700519 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.700528 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.792677 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.792752 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.792816 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.793566 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.793613 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.792914734 +0000 UTC m=+81.047812613 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.793641 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.793663 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.79365207 +0000 UTC m=+81.048549959 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.793720 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.793696102 +0000 UTC m=+81.048594002 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.803156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.803197 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.803207 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.803226 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.803237 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.894046 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.894106 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.894130 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894254 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894303 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894316 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894397 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.894376654 +0000 UTC m=+81.149274553 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894263 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894463 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894321 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894544 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.894530536 +0000 UTC m=+81.149428414 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894493 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:04:50 crc kubenswrapper[4883]: E0310 09:04:50.894580 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:04:54.894575069 +0000 UTC m=+81.149472958 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.905497 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.905531 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.905542 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.905552 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:50 crc kubenswrapper[4883]: I0310 09:04:50.905560 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:50Z","lastTransitionTime":"2026-03-10T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.007044 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.007100 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.007112 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.007127 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.007139 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.079405 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.079424 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.079446 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.079531 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:04:51 crc kubenswrapper[4883]: E0310 09:04:51.079615 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:04:51 crc kubenswrapper[4883]: E0310 09:04:51.079733 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5"
Mar 10 09:04:51 crc kubenswrapper[4883]: E0310 09:04:51.079836 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:04:51 crc kubenswrapper[4883]: E0310 09:04:51.079891 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.108825 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.108861 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.108872 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.108887 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.108895 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.211610 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.211643 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.211652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.211663 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.211672 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.313774 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.313804 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.313815 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.313828 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.313836 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.415117 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.415152 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.415179 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.415194 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.415205 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.516874 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.516897 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.516907 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.516919 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.516928 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.618491 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.618520 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.618530 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.618542 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.618549 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.721173 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.721210 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.721219 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.721230 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.721240 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.823043 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.823084 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.823100 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.823118 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.823131 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.925014 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.925070 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.925083 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.925105 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:51 crc kubenswrapper[4883]: I0310 09:04:51.925127 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:51Z","lastTransitionTime":"2026-03-10T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.027253 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.027308 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.027325 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.027344 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.027358 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.129201 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.129231 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.129241 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.129256 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.129264 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.230857 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.230898 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.230912 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.230928 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.230939 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.333757 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.333806 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.333815 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.333833 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.333846 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.435754 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.435808 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.435820 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.435843 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.435859 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.538413 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.538568 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.538634 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.538703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.538760 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.640638 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.640689 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.640700 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.640720 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.640908 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.743900 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.743951 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.743961 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.743978 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.743993 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.845708 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.845736 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.845748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.845760 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.845771 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.948101 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.948140 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.948150 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.948165 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:52 crc kubenswrapper[4883]: I0310 09:04:52.948178 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:52Z","lastTransitionTime":"2026-03-10T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.050508 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.050548 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.050561 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.050579 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.050592 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.078875 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.079008 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:53 crc kubenswrapper[4883]: E0310 09:04:53.079105 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.079165 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.079210 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:53 crc kubenswrapper[4883]: E0310 09:04:53.079220 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:53 crc kubenswrapper[4883]: E0310 09:04:53.079296 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:53 crc kubenswrapper[4883]: E0310 09:04:53.079446 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.153372 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.153399 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.153409 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.153420 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.153428 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.254981 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.255027 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.255040 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.255052 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.255062 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.356815 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.356847 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.356856 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.356871 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.356882 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.458778 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.458810 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.458821 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.458835 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.458845 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.560671 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.560708 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.560719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.560731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.560741 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.663124 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.663158 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.663167 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.663178 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.663186 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.765196 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.765235 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.765246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.765258 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.765269 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.867466 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.867525 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.867539 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.867550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.867560 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.969353 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.969389 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.969398 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.969411 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:53 crc kubenswrapper[4883]: I0310 09:04:53.969420 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:53Z","lastTransitionTime":"2026-03-10T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.072104 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.072147 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.072159 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.072173 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.072188 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.089901 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.097303 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.110966 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.121088 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.127313 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.135299 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.144601 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.153046 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.160702 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.172830 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.174281 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.174311 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.174321 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.174335 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.174346 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.181635 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.188696 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.194095 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.199907 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.206277 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.276262 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.276293 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.276304 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.276314 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.276323 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.378657 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.378712 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.378728 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.378748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.378761 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.481051 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.481087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.481097 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.481109 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.481117 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.583577 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.583622 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.583637 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.583654 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.583677 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.686535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.686572 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.686583 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.686596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.686604 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.788639 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.788671 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.788679 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.788690 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.788697 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.828284 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.828344 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.828367 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.828508 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:05:02.828458436 +0000 UTC m=+89.083356325 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.828518 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.828545 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.828566 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:02.828557603 +0000 UTC m=+89.083455493 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.828587 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 09:05:02.828574105 +0000 UTC m=+89.083471994 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.890231 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.890257 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.890267 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.890276 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.890284 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.929281 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.929333 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.929363 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929467 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929558 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929581 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:54 crc 
kubenswrapper[4883]: E0310 09:04:54.929593 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929570 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:02.929546729 +0000 UTC m=+89.184444618 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929674 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:02.929656086 +0000 UTC m=+89.184553974 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929695 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929717 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929731 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:54 crc kubenswrapper[4883]: E0310 09:04:54.929775 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:02.929764381 +0000 UTC m=+89.184662270 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.993320 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.993380 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.993393 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.993409 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:54 crc kubenswrapper[4883]: I0310 09:04:54.993420 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:54Z","lastTransitionTime":"2026-03-10T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.079920 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.080005 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:55 crc kubenswrapper[4883]: E0310 09:04:55.080096 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.079931 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.080148 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:55 crc kubenswrapper[4883]: E0310 09:04:55.080315 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:55 crc kubenswrapper[4883]: E0310 09:04:55.080431 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:55 crc kubenswrapper[4883]: E0310 09:04:55.080639 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.091832 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.095863 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.095895 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.095904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.095918 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.095928 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.198210 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.198247 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.198257 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.198270 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.198284 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.301739 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.301840 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.301851 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.301875 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.301889 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.404241 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.404277 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.404288 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.404307 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.404318 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.506826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.506872 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.506883 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.506904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.506919 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.609696 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.609738 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.609750 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.609769 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.609782 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.711138 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.711192 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.711204 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.711221 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.711233 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.813633 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.813703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.813721 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.813737 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.813747 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.915220 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.915262 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.915272 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.915292 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:55 crc kubenswrapper[4883]: I0310 09:04:55.915307 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:55Z","lastTransitionTime":"2026-03-10T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.018161 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.018216 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.018228 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.018246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.018258 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.120221 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.120259 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.120295 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.120311 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.120321 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.222708 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.222762 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.222773 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.222791 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.222801 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.324606 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.324678 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.324687 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.324697 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.324705 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.426718 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.426744 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.426754 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.426783 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.426791 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.534667 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.534719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.534734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.534752 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.534769 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.636447 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.636510 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.636520 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.636539 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.636551 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.738712 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.738770 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.738784 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.738797 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.738806 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.840703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.840742 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.840755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.840769 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.840778 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.942415 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.942498 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.942513 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.942535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:56 crc kubenswrapper[4883]: I0310 09:04:56.942545 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:56Z","lastTransitionTime":"2026-03-10T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.044170 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.044199 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.044232 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.044246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.044256 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.078921 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.078936 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.078948 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:57 crc kubenswrapper[4883]: E0310 09:04:57.079020 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.079047 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:57 crc kubenswrapper[4883]: E0310 09:04:57.079135 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:57 crc kubenswrapper[4883]: E0310 09:04:57.080273 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:57 crc kubenswrapper[4883]: E0310 09:04:57.080344 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.145690 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.145726 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.145734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.145748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.145757 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.248127 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.248171 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.248182 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.248200 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.248213 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.350135 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.350179 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.350189 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.350204 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.350216 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.451879 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.452163 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.452185 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.452201 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.452213 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.554717 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.554747 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.554757 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.554769 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.554779 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.656221 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.656268 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.656282 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.656301 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.656312 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.758681 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.758711 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.758736 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.758752 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.758761 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.860723 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.860780 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.860794 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.860813 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.860826 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.962942 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.962997 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.963145 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.963167 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:57 crc kubenswrapper[4883]: I0310 09:04:57.963180 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:57Z","lastTransitionTime":"2026-03-10T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.065813 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.065868 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.065878 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.065900 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.065911 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.168655 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.168693 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.168703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.168716 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.168727 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.252178 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.252254 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.252265 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.252284 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.252298 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.260855 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.263745 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.263789 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.263800 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.263813 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.263824 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.269696 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.272584 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.272626 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.272636 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.272656 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.272666 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.278928 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.281317 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.281353 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.281363 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.281377 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.281386 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.287962 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.290378 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.290422 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.290432 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.290445 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.290455 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.298572 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:04:58 crc kubenswrapper[4883]: E0310 09:04:58.298683 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.300556 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.300579 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.300588 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.300599 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.300608 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.403061 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.403097 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.403106 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.403118 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.403127 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.505471 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.505537 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.505550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.505567 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.505580 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.607170 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.607203 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.607213 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.607227 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.607236 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.709510 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.709545 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.709556 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.709566 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.709573 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.811967 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.812011 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.812020 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.812047 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.812059 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.914013 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.914063 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.914075 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.914098 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:58 crc kubenswrapper[4883]: I0310 09:04:58.914111 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:58Z","lastTransitionTime":"2026-03-10T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.016152 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.016191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.016201 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.016217 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.016228 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.079172 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.079289 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.079607 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.079728 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.079755 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.079628 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.080014 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.080234 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.081776 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:59 crc kubenswrapper[4883]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 10 09:04:59 crc kubenswrapper[4883]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 10 09:04:59 crc kubenswrapper[4883]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wszn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:59 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.081867 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" 
logger="UnhandledError" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.082762 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:04:59 crc kubenswrapper[4883]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 10 09:04:59 crc kubenswrapper[4883]: apiVersion: v1 Mar 10 09:04:59 crc kubenswrapper[4883]: clusters: Mar 10 09:04:59 crc kubenswrapper[4883]: - cluster: Mar 10 09:04:59 crc kubenswrapper[4883]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 10 09:04:59 crc kubenswrapper[4883]: server: https://api-int.crc.testing:6443 Mar 10 09:04:59 crc kubenswrapper[4883]: name: default-cluster Mar 10 09:04:59 crc kubenswrapper[4883]: contexts: Mar 10 09:04:59 crc kubenswrapper[4883]: - context: Mar 10 09:04:59 crc kubenswrapper[4883]: cluster: default-cluster Mar 10 09:04:59 crc kubenswrapper[4883]: namespace: default Mar 10 09:04:59 crc kubenswrapper[4883]: user: default-auth Mar 10 09:04:59 crc kubenswrapper[4883]: name: default-context Mar 10 09:04:59 crc kubenswrapper[4883]: current-context: default-context Mar 10 09:04:59 crc kubenswrapper[4883]: kind: Config Mar 10 09:04:59 crc kubenswrapper[4883]: preferences: {} Mar 10 09:04:59 crc kubenswrapper[4883]: users: Mar 10 09:04:59 crc kubenswrapper[4883]: - name: default-auth Mar 10 09:04:59 crc kubenswrapper[4883]: user: Mar 10 09:04:59 crc kubenswrapper[4883]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:59 crc kubenswrapper[4883]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 10 09:04:59 crc kubenswrapper[4883]: EOF Mar 10 09:04:59 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h98t5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:04:59 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.083540 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.084322 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58nsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.085811 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:04:59 crc kubenswrapper[4883]: E0310 09:04:59.085901 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.118530 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.118571 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.118583 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.118606 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.118621 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.221414 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.221456 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.221466 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.221503 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.221514 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.323418 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.323465 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.323493 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.323514 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.323529 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.425697 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.425731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.425739 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.425754 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.425768 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.528236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.528284 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.528296 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.528315 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.528326 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.630703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.630757 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.630771 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.630793 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.630805 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.733464 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.733524 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.733535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.733549 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.733560 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.835879 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.835923 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.835935 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.835953 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.835965 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.937831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.937858 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.937870 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.937881 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:04:59 crc kubenswrapper[4883]: I0310 09:04:59.937889 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:04:59Z","lastTransitionTime":"2026-03-10T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.039548 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.039580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.039590 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.039616 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.039626 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: E0310 09:05:00.080688 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:00 crc kubenswrapper[4883]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 10 09:05:00 crc kubenswrapper[4883]: while [ true ]; Mar 10 09:05:00 crc kubenswrapper[4883]: do Mar 10 09:05:00 crc kubenswrapper[4883]: for f in $(ls /tmp/serviceca); do Mar 10 09:05:00 crc kubenswrapper[4883]: echo $f Mar 10 09:05:00 crc kubenswrapper[4883]: ca_file_path="/tmp/serviceca/${f}" Mar 10 09:05:00 crc kubenswrapper[4883]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 10 09:05:00 crc kubenswrapper[4883]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 10 09:05:00 crc kubenswrapper[4883]: if [ -e "${reg_dir_path}" ]; then Mar 10 09:05:00 crc kubenswrapper[4883]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 10 09:05:00 crc kubenswrapper[4883]: else Mar 10 09:05:00 crc kubenswrapper[4883]: mkdir $reg_dir_path Mar 10 09:05:00 crc kubenswrapper[4883]: cp $ca_file_path $reg_dir_path/ca.crt Mar 10 09:05:00 crc kubenswrapper[4883]: fi Mar 10 09:05:00 crc kubenswrapper[4883]: done Mar 10 09:05:00 crc kubenswrapper[4883]: for d in $(ls /etc/docker/certs.d); do Mar 10 09:05:00 crc kubenswrapper[4883]: echo $d Mar 10 09:05:00 crc kubenswrapper[4883]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 10 09:05:00 crc kubenswrapper[4883]: reg_conf_path="/tmp/serviceca/${dp}" Mar 10 09:05:00 crc kubenswrapper[4883]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 10 09:05:00 crc kubenswrapper[4883]: rm -rf /etc/docker/certs.d/$d Mar 10 09:05:00 crc kubenswrapper[4883]: fi Mar 10 09:05:00 crc kubenswrapper[4883]: done Mar 10 09:05:00 crc kubenswrapper[4883]: sleep 60 & wait ${!} Mar 10 09:05:00 crc kubenswrapper[4883]: done Mar 10 09:05:00 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsr4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-vvbjw_openshift-image-registry(53ffac75-0989-4945-915d-4aacec270cdb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:00 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:00 crc kubenswrapper[4883]: E0310 09:05:00.081873 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-vvbjw" podUID="53ffac75-0989-4945-915d-4aacec270cdb" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.141703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.141752 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.141764 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.141779 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.141793 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.243424 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.243501 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.243522 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.243544 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.243558 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.345103 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.345157 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.345175 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.345193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.345207 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.447550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.447586 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.447614 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.447628 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.447638 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.549132 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.549182 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.549191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.549208 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.549220 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.651684 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.651731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.651743 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.651757 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.651771 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.753690 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.753723 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.753732 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.753742 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.753753 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.855845 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.855880 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.855890 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.855900 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.855908 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.958082 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.958117 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.958126 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.958138 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:00 crc kubenswrapper[4883]: I0310 09:05:00.958147 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:00Z","lastTransitionTime":"2026-03-10T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.060500 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.060534 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.060543 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.060555 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.060565 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.079847 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.079871 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.080113 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.080187 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.080224 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.080332 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.080400 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.081895 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:01 crc kubenswrapper[4883]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 09:05:01 crc kubenswrapper[4883]: set -o allexport Mar 10 09:05:01 crc kubenswrapper[4883]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 09:05:01 crc kubenswrapper[4883]: source /etc/kubernetes/apiserver-url.env Mar 10 09:05:01 crc kubenswrapper[4883]: else Mar 10 09:05:01 crc kubenswrapper[4883]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 09:05:01 crc kubenswrapper[4883]: exit 1 Mar 10 09:05:01 crc kubenswrapper[4883]: fi Mar 10 09:05:01 crc kubenswrapper[4883]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 09:05:01 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:01 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.081920 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.083008 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:01 crc kubenswrapper[4883]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:05:01 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:05:01 crc kubenswrapper[4883]: set -o allexport Mar 10 09:05:01 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:05:01 crc kubenswrapper[4883]: set +o allexport Mar 10 09:05:01 crc kubenswrapper[4883]: fi Mar 10 09:05:01 crc kubenswrapper[4883]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 09:05:01 crc kubenswrapper[4883]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 09:05:01 crc kubenswrapper[4883]: ho_enable="--enable-hybrid-overlay" Mar 10 09:05:01 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 09:05:01 crc kubenswrapper[4883]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 09:05:01 crc kubenswrapper[4883]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 09:05:01 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:05:01 crc kubenswrapper[4883]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 09:05:01 crc kubenswrapper[4883]: --webhook-host=127.0.0.1 \ Mar 10 09:05:01 crc kubenswrapper[4883]: --webhook-port=9743 \ Mar 10 09:05:01 crc kubenswrapper[4883]: ${ho_enable} \ Mar 10 09:05:01 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:05:01 crc 
kubenswrapper[4883]: --disable-approver \ Mar 10 09:05:01 crc kubenswrapper[4883]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 09:05:01 crc kubenswrapper[4883]: --wait-for-kubernetes-api=200s \ Mar 10 09:05:01 crc kubenswrapper[4883]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 09:05:01 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:05:01 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions
:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:01 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.083046 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.083304 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zdjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-nrzgf_openshift-multus(6c845e62-37a1-473c-a4d0-a354594903bc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.083374 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.084444 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.084504 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" podUID="6c845e62-37a1-473c-a4d0-a354594903bc" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.085653 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:01 crc kubenswrapper[4883]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:05:01 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:05:01 crc kubenswrapper[4883]: set -o allexport Mar 10 09:05:01 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:05:01 crc kubenswrapper[4883]: set +o allexport Mar 10 09:05:01 crc kubenswrapper[4883]: fi Mar 10 09:05:01 crc kubenswrapper[4883]: Mar 10 09:05:01 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 09:05:01 crc kubenswrapper[4883]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 09:05:01 crc kubenswrapper[4883]: --disable-webhook \ Mar 10 09:05:01 crc kubenswrapper[4883]: 
--csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 09:05:01 crc kubenswrapper[4883]: --loglevel="${LOGLEVEL}" Mar 10 09:05:01 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:01 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:01 crc kubenswrapper[4883]: E0310 09:05:01.086760 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.162417 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.162502 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.162514 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.162534 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.162547 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.264000 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.264037 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.264047 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.264070 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.264082 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.366526 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.366560 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.366569 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.366588 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.366599 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.468931 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.468973 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.468983 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.468998 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.469008 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.570889 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.571132 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.571143 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.571159 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.571172 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.674381 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.674454 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.674468 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.674517 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.674532 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.776882 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.777036 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.777116 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.777183 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.777244 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.880336 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.880376 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.880388 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.880401 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.880408 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.982647 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.982688 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.982702 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.982714 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:01 crc kubenswrapper[4883]: I0310 09:05:01.982724 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:01Z","lastTransitionTime":"2026-03-10T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.082078 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:02 crc kubenswrapper[4883]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 10 09:05:02 crc kubenswrapper[4883]: set -euo pipefail Mar 10 09:05:02 crc kubenswrapper[4883]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 10 09:05:02 crc kubenswrapper[4883]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 10 09:05:02 crc kubenswrapper[4883]: # As the secret mount is optional we must wait for the files to be present. 
Mar 10 09:05:02 crc kubenswrapper[4883]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 10 09:05:02 crc kubenswrapper[4883]: TS=$(date +%s) Mar 10 09:05:02 crc kubenswrapper[4883]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 10 09:05:02 crc kubenswrapper[4883]: HAS_LOGGED_INFO=0 Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: log_missing_certs(){ Mar 10 09:05:02 crc kubenswrapper[4883]: CUR_TS=$(date +%s) Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 10 09:05:02 crc kubenswrapper[4883]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 10 09:05:02 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 10 09:05:02 crc kubenswrapper[4883]: HAS_LOGGED_INFO=1 Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: } Mar 10 09:05:02 crc kubenswrapper[4883]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 10 09:05:02 crc kubenswrapper[4883]: log_missing_certs Mar 10 09:05:02 crc kubenswrapper[4883]: sleep 5 Mar 10 09:05:02 crc kubenswrapper[4883]: done Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 10 09:05:02 crc kubenswrapper[4883]: exec /usr/bin/kube-rbac-proxy \ Mar 10 09:05:02 crc kubenswrapper[4883]: --logtostderr \ Mar 10 09:05:02 crc kubenswrapper[4883]: --secure-listen-address=:9108 \ Mar 10 09:05:02 crc kubenswrapper[4883]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 10 09:05:02 crc kubenswrapper[4883]: --upstream=http://127.0.0.1:29108/ \ Mar 10 09:05:02 crc kubenswrapper[4883]: --tls-private-key-file=${TLS_PK} \ Mar 10 09:05:02 crc kubenswrapper[4883]: --tls-cert-file=${TLS_CERT} Mar 10 09:05:02 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:02 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.084333 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:02 crc kubenswrapper[4883]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ -f "/env/_master" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: set -o allexport Mar 10 09:05:02 crc kubenswrapper[4883]: source "/env/_master" Mar 10 09:05:02 crc kubenswrapper[4883]: set +o allexport Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v4_join_subnet_opt= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 10 
09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v6_transit_switch_subnet_opt= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "" != "" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "false" == "true" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: persistent_ips_enabled_flag= Mar 10 09:05:02 crc kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: # This is needed so that converting clusters from GA to TP Mar 10 09:05:02 crc kubenswrapper[4883]: # will rollout control plane pods as well Mar 10 09:05:02 crc kubenswrapper[4883]: network_segmentation_enabled_flag= Mar 10 09:05:02 crc kubenswrapper[4883]: multi_network_enabled_flag= Mar 10 09:05:02 crc 
kubenswrapper[4883]: if [[ "true" == "true" ]]; then Mar 10 09:05:02 crc kubenswrapper[4883]: multi_network_enabled_flag="--enable-multi-network" Mar 10 09:05:02 crc kubenswrapper[4883]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 10 09:05:02 crc kubenswrapper[4883]: fi Mar 10 09:05:02 crc kubenswrapper[4883]: Mar 10 09:05:02 crc kubenswrapper[4883]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 10 09:05:02 crc kubenswrapper[4883]: exec /usr/bin/ovnkube \ Mar 10 09:05:02 crc kubenswrapper[4883]: --enable-interconnect \ Mar 10 09:05:02 crc kubenswrapper[4883]: --init-cluster-manager "${K8S_NODE}" \ Mar 10 09:05:02 crc kubenswrapper[4883]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 10 09:05:02 crc kubenswrapper[4883]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 10 09:05:02 crc kubenswrapper[4883]: --metrics-bind-address "127.0.0.1:29108" \ Mar 10 09:05:02 crc kubenswrapper[4883]: --metrics-enable-pprof \ Mar 10 09:05:02 crc kubenswrapper[4883]: --metrics-enable-config-duration \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${ovn_v4_join_subnet_opt} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${ovn_v6_join_subnet_opt} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${dns_name_resolver_enabled_flag} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${persistent_ips_enabled_flag} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${multi_network_enabled_flag} \ Mar 10 09:05:02 crc kubenswrapper[4883]: ${network_segmentation_enabled_flag} Mar 10 09:05:02 crc kubenswrapper[4883]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2lkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-x7sm9_openshift-ovn-kubernetes(5fd36c79-e84e-49aa-97b9-616563193cd2): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:02 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.084747 4883 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.084791 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.084809 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.084829 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.084839 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.085769 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" podUID="5fd36c79-e84e-49aa-97b9-616563193cd2"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.095712 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.187198 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.187238 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.187250 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.187263 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.187276 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.289788 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.289828 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.289838 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.289855 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.289867 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.391900 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.391940 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.391949 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.391963 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.391988 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.493734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.493772 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.493786 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.493804 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.493818 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.596129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.596535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.596607 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.596673 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.596739 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.699235 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.699284 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.699295 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.699310 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.699321 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.801224 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.801258 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.801283 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.801301 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.801309 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.902838 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.903009 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:05:18.902985897 +0000 UTC m=+105.157883785 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.904049 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.904126 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.904306 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.904368 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:18.904351932 +0000 UTC m=+105.159249821 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.904434 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:05:02 crc kubenswrapper[4883]: E0310 09:05:02.904531 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:18.904460488 +0000 UTC m=+105.159358376 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.908786 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.908831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.908844 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.908861 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:02 crc kubenswrapper[4883]: I0310 09:05:02.908873 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:02Z","lastTransitionTime":"2026-03-10T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.004645 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.004685 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.004730 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n"
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004776 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004802 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004815 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004820 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004833 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004850 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004864 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004868 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:19.004853123 +0000 UTC m=+105.259751013 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004888 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:19.004881457 +0000 UTC m=+105.259779346 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.004902 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:19.004896645 +0000 UTC m=+105.259794524 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.012005 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.012042 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.012053 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.012103 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.012112 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.079918 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.079934 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.079937 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.080046 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.080058 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.080365 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.080433 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5"
Mar 10 09:05:03 crc kubenswrapper[4883]: E0310 09:05:03.080387 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.113772 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.113809 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.113823 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.113837 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.113849 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.215835 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.215859 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.215869 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.215884 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.215892 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.318154 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.318192 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.318204 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.318217 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.318228 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.420545 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.420593 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.420606 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.420623 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.420635 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.522710 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.522748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.522777 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.522799 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.522810 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.624779 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.624839 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.624850 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.624864 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.624884 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.726532 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.726561 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.726570 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.726584 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.726599 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.828690 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.828730 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.828740 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.828756 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.828767 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.930886 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.930947 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.930960 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.930975 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 09:05:03 crc kubenswrapper[4883]: I0310 09:05:03.930986 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:03Z","lastTransitionTime":"2026-03-10T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.032580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.032611 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.032620 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.032633 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.032641 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.080550 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:05:04 crc kubenswrapper[4883]: E0310 09:05:04.080738 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:05:04 crc kubenswrapper[4883]: E0310 09:05:04.082351 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:05:04 crc kubenswrapper[4883]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 10 09:05:04 crc kubenswrapper[4883]: set -uo pipefail Mar 10 09:05:04 crc kubenswrapper[4883]: Mar 10 09:05:04 crc kubenswrapper[4883]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 10 09:05:04 crc kubenswrapper[4883]: Mar 10 09:05:04 crc kubenswrapper[4883]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 10 09:05:04 crc kubenswrapper[4883]: HOSTS_FILE="/etc/hosts" Mar 10 09:05:04 crc kubenswrapper[4883]: TEMP_FILE="/etc/hosts.tmp" Mar 10 09:05:04 crc kubenswrapper[4883]: Mar 10 09:05:04 crc kubenswrapper[4883]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 10 09:05:04 crc kubenswrapper[4883]: Mar 10 09:05:04 crc kubenswrapper[4883]: # Make a temporary file with the old hosts file's attributes. Mar 10 09:05:04 crc kubenswrapper[4883]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 10 09:05:04 crc kubenswrapper[4883]: echo "Failed to preserve hosts file. Exiting." 
Mar 10 09:05:04 crc kubenswrapper[4883]: exit 1 Mar 10 09:05:04 crc kubenswrapper[4883]: fi Mar 10 09:05:04 crc kubenswrapper[4883]: Mar 10 09:05:04 crc kubenswrapper[4883]: while true; do Mar 10 09:05:04 crc kubenswrapper[4883]: declare -A svc_ips Mar 10 09:05:04 crc kubenswrapper[4883]: for svc in "${services[@]}"; do Mar 10 09:05:04 crc kubenswrapper[4883]: # Fetch service IP from cluster dns if present. We make several tries Mar 10 09:05:04 crc kubenswrapper[4883]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 10 09:05:04 crc kubenswrapper[4883]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 10 09:05:04 crc kubenswrapper[4883]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 10 09:05:04 crc kubenswrapper[4883]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:05:04 crc kubenswrapper[4883]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:05:04 crc kubenswrapper[4883]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 10 09:05:04 crc kubenswrapper[4883]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 10 09:05:04 crc kubenswrapper[4883]: for i in ${!cmds[*]} Mar 10 09:05:04 crc kubenswrapper[4883]: do Mar 10 09:05:04 crc kubenswrapper[4883]: ips=($(eval "${cmds[i]}")) Mar 10 09:05:04 crc kubenswrapper[4883]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 10 09:05:04 crc kubenswrapper[4883]: svc_ips["${svc}"]="${ips[@]}" Mar 10 09:05:04 crc kubenswrapper[4883]: break Mar 10 09:05:04 crc kubenswrapper[4883]: fi Mar 10 09:05:04 crc kubenswrapper[4883]: done Mar 10 09:05:04 crc kubenswrapper[4883]: done Mar 10 09:05:04 crc kubenswrapper[4883]: Mar 10 09:05:04 crc kubenswrapper[4883]: # Update /etc/hosts only if we get valid service IPs Mar 10 09:05:04 crc kubenswrapper[4883]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 10 09:05:04 crc kubenswrapper[4883]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 10 09:05:04 crc kubenswrapper[4883]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 10 09:05:04 crc kubenswrapper[4883]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 10 09:05:04 crc kubenswrapper[4883]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 10 09:05:04 crc kubenswrapper[4883]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 10 09:05:04 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:05:04 crc kubenswrapper[4883]: continue Mar 10 09:05:04 crc kubenswrapper[4883]: fi Mar 10 09:05:04 crc kubenswrapper[4883]: Mar 10 09:05:04 crc kubenswrapper[4883]: # Append resolver entries for services Mar 10 09:05:04 crc kubenswrapper[4883]: rc=0 Mar 10 09:05:04 crc kubenswrapper[4883]: for svc in "${!svc_ips[@]}"; do Mar 10 09:05:04 crc kubenswrapper[4883]: for ip in ${svc_ips[${svc}]}; do Mar 10 09:05:04 crc kubenswrapper[4883]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 10 09:05:04 crc kubenswrapper[4883]: done Mar 10 09:05:04 crc kubenswrapper[4883]: done Mar 10 09:05:04 crc kubenswrapper[4883]: if [[ $rc -ne 0 ]]; then Mar 10 09:05:04 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:05:04 crc kubenswrapper[4883]: continue Mar 10 09:05:04 crc kubenswrapper[4883]: fi Mar 10 09:05:04 crc kubenswrapper[4883]: Mar 10 09:05:04 crc kubenswrapper[4883]: Mar 10 09:05:04 crc kubenswrapper[4883]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 10 09:05:04 crc kubenswrapper[4883]: # Replace /etc/hosts with our modified version if needed Mar 10 09:05:04 crc kubenswrapper[4883]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 10 09:05:04 crc kubenswrapper[4883]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 10 09:05:04 crc kubenswrapper[4883]: fi Mar 10 09:05:04 crc kubenswrapper[4883]: sleep 60 & wait Mar 10 09:05:04 crc kubenswrapper[4883]: unset svc_ips Mar 10 09:05:04 crc kubenswrapper[4883]: done Mar 10 09:05:04 crc kubenswrapper[4883]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqn66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-7xb47_openshift-dns(d41077f5-9f66-4be5-bb1a-e0f5b2b078e0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 09:05:04 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:05:04 crc kubenswrapper[4883]: E0310 09:05:04.083464 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-7xb47" podUID="d41077f5-9f66-4be5-bb1a-e0f5b2b078e0" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.090606 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.099035 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.107595 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.118943 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.126253 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.132092 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.134236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.134291 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.134308 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.134327 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.134339 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.138121 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.146608 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.154341 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.160299 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.172059 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.180849 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.186413 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.193189 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.199513 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.205706 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.213228 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.236956 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.236991 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.237003 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.237018 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.237030 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.339224 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.339249 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.339258 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.339269 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.339278 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.440981 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.441028 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.441039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.441058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.441070 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.542787 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.542825 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.542837 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.542848 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.542860 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.645077 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.645110 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.645119 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.645132 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.645141 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.747358 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.747404 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.747415 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.747430 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.747440 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.849826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.849872 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.849884 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.849904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.849916 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.952210 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.952241 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.952250 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.952264 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:04 crc kubenswrapper[4883]: I0310 09:05:04.952273 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:04Z","lastTransitionTime":"2026-03-10T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.054068 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.054139 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.054156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.054175 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.054191 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.079589 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.079658 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.079699 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.079828 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:05 crc kubenswrapper[4883]: E0310 09:05:05.079844 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:05 crc kubenswrapper[4883]: E0310 09:05:05.079947 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:05 crc kubenswrapper[4883]: E0310 09:05:05.080066 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:05 crc kubenswrapper[4883]: E0310 09:05:05.080201 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.156535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.156572 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.156582 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.156596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.156606 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.258615 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.258645 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.258655 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.258668 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.258678 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.360990 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.361159 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.361240 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.361302 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.361358 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.462980 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.463016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.463027 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.463041 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.463048 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.532810 4883 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.565046 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.565081 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.565106 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.565122 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.565135 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.667038 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.667153 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.667229 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.667295 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.667349 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.769073 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.769368 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.769378 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.769388 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.769397 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.871322 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.871745 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.871824 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.871897 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.871972 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.974197 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.974225 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.974233 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.974247 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:05 crc kubenswrapper[4883]: I0310 09:05:05.974256 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:05Z","lastTransitionTime":"2026-03-10T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.075662 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.075907 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.075991 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.076075 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.076143 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.179286 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.179504 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.179571 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.179646 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.179732 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.282943 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.283175 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.283272 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.283356 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.283453 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.385665 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.385699 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.385710 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.385723 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.385733 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.488054 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.488118 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.488130 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.488149 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.488164 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.590580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.590613 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.590623 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.590639 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.590647 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.692618 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.692774 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.692831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.692903 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.692958 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.794805 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.794841 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.794851 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.794864 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.794874 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.896719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.896757 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.896768 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.896783 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.896793 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.998432 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.998469 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.998501 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.998517 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:06 crc kubenswrapper[4883]: I0310 09:05:06.998526 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:06Z","lastTransitionTime":"2026-03-10T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.078872 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.078918 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.078922 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:07 crc kubenswrapper[4883]: E0310 09:05:07.079019 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.079048 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:07 crc kubenswrapper[4883]: E0310 09:05:07.079169 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:07 crc kubenswrapper[4883]: E0310 09:05:07.079298 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:07 crc kubenswrapper[4883]: E0310 09:05:07.079382 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.100321 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.100349 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.100360 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.100374 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.100383 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.202391 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.202580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.202645 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.202705 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.202765 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.304791 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.304844 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.304871 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.304889 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.304900 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.407794 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.407840 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.407850 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.407862 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.407871 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.510021 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.510091 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.510119 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.510132 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.510141 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.612826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.612881 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.612896 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.612918 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.612932 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.716009 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.716042 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.716051 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.716065 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.716075 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.818433 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.818550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.818564 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.818586 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.818598 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.920891 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.920930 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.920944 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.920955 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:07 crc kubenswrapper[4883]: I0310 09:05:07.920967 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:07Z","lastTransitionTime":"2026-03-10T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.022988 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.023034 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.023043 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.023066 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.023078 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.125121 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.125199 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.125216 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.125246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.125261 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.227439 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.227512 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.227528 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.227542 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.227555 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.329064 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.329115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.329125 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.329141 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.329152 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.430594 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.430642 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.430652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.430669 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.430683 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.529398 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.529436 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.529446 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.529457 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.529466 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.538653 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.541491 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.541525 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.541536 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.541549 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.541560 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.549946 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.553085 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.553135 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.553148 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.553165 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.553176 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.561039 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.564657 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.564707 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.564720 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.564739 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.564751 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.571817 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.574554 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.574606 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.574617 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.574629 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.574654 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.584915 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:08 crc kubenswrapper[4883]: E0310 09:05:08.585179 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.586618 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.586705 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.586778 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.586841 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.586898 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.689185 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.689220 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.689229 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.689242 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.689250 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.791170 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.791200 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.791211 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.791224 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.791235 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.892927 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.892958 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.892968 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.892982 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.892991 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.995020 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.995046 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.995056 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.995071 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:08 crc kubenswrapper[4883]: I0310 09:05:08.995079 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:08Z","lastTransitionTime":"2026-03-10T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.079300 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.079417 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:09 crc kubenswrapper[4883]: E0310 09:05:09.079573 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.079604 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:09 crc kubenswrapper[4883]: E0310 09:05:09.079776 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:09 crc kubenswrapper[4883]: E0310 09:05:09.079919 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.079973 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:09 crc kubenswrapper[4883]: E0310 09:05:09.080074 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.096429 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.096457 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.096470 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.096500 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.096509 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.198411 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.198457 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.198469 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.198495 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.198508 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.300051 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.300085 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.300096 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.300110 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.300127 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.401521 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.401545 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.401554 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.401566 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.401575 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.503276 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.503299 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.503308 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.503319 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.503332 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.604844 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.604880 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.604890 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.604901 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.604908 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.705987 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.706008 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.706016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.706025 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.706032 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.807575 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.807632 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.807644 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.807660 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.807669 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.909736 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.909781 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.909791 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.909808 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:09 crc kubenswrapper[4883]: I0310 09:05:09.909819 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:09Z","lastTransitionTime":"2026-03-10T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.012278 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.012326 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.012337 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.012355 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.012368 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.114106 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.114153 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.114164 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.114177 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.114187 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.216442 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.216505 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.216515 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.216531 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.216543 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.318725 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.318753 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.318764 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.318778 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.318785 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.421039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.421076 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.421085 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.421098 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.421107 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.523263 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.523305 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.523314 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.523327 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.523335 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.625396 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.625430 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.625439 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.625453 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.625465 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.727355 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.727412 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.727424 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.727444 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.727452 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.829670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.829709 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.829721 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.829732 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.829741 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.932149 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.932236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.932269 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.932290 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:10 crc kubenswrapper[4883]: I0310 09:05:10.932301 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:10Z","lastTransitionTime":"2026-03-10T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.033776 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.033816 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.033827 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.033841 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.033851 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.079274 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.079304 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.079316 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.079391 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:11 crc kubenswrapper[4883]: E0310 09:05:11.079415 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:11 crc kubenswrapper[4883]: E0310 09:05:11.079560 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:11 crc kubenswrapper[4883]: E0310 09:05:11.079836 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:11 crc kubenswrapper[4883]: E0310 09:05:11.079921 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.137904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.137942 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.137952 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.137967 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.137979 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.240135 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.240170 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.240181 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.240196 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.240206 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.342160 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.342193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.342205 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.342221 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.342233 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.392510 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vvbjw" event={"ID":"53ffac75-0989-4945-915d-4aacec270cdb","Type":"ContainerStarted","Data":"082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.401263 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.415930 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.425913 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.432065 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.441095 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.443961 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.443991 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.444001 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 
09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.444016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.444027 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.447915 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.454766 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.462560 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.470046 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.477903 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.485884 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
20f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.497250 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9
9fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.504295 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.511008 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.517013 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kube
let\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.522644 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.529701 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.546463 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.546510 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.546522 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.546541 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.546556 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.648446 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.648493 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.648503 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.648515 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.648526 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.750219 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.750270 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.750283 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.750304 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.750317 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.852513 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.852550 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.852565 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.852577 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.852585 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.954371 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.954408 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.954417 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.954433 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:11 crc kubenswrapper[4883]: I0310 09:05:11.954445 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:11Z","lastTransitionTime":"2026-03-10T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.056039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.056080 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.056095 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.056108 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.056117 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.158916 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.158951 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.158964 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.158979 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.158992 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.261901 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.261945 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.261973 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.261994 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.262006 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.364440 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.364519 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.364532 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.364553 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.364587 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.396447 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" exitCode=0 Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.396554 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.399005 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.399055 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.401255 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.401299 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4"} Mar 10 09:05:12 crc kubenswrapper[4883]: 
I0310 09:05:12.407216 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.416200 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.426228 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.433296 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.441897 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.450169 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.457246 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.463913 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.467005 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.467046 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.467058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.467075 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.467085 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.479909 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-ce
rt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-open
vswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.491895 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.502284 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.513074 4883 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.524168 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.533592 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc 
kubenswrapper[4883]: I0310 09:05:12.542532 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.570109 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.570184 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.570196 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 
09:05:12.570216 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.570230 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.586120 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.616163 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.628335 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.639881 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.649400 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.658792 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.667558 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc 
kubenswrapper[4883]: I0310 09:05:12.673087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.673142 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.673154 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.673171 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.673184 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.675458 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.686630 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.707298 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.715805 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.727433 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.736071 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.746908 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.756332 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.765763 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.774923 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.775697 4883 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.775730 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.775739 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.775755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.775765 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.790282 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.802316 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:12Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.878384 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.878719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.878731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc 
kubenswrapper[4883]: I0310 09:05:12.878747 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.878761 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.981193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.981217 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.981226 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.981238 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:12 crc kubenswrapper[4883]: I0310 09:05:12.981246 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:12Z","lastTransitionTime":"2026-03-10T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.079891 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.079969 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.080168 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.080196 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:13 crc kubenswrapper[4883]: E0310 09:05:13.080309 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:13 crc kubenswrapper[4883]: E0310 09:05:13.080436 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:13 crc kubenswrapper[4883]: E0310 09:05:13.080534 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:13 crc kubenswrapper[4883]: E0310 09:05:13.080760 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.084789 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.084814 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.084826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.084842 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.084854 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.187679 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.187980 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.188003 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.188024 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.188035 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.289652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.289685 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.289699 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.289716 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.289726 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.392150 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.392212 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.392228 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.392246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.392260 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.409695 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.409738 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.409749 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.409759 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.409769 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.494680 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.494737 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 
crc kubenswrapper[4883]: I0310 09:05:13.494750 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.494775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.494789 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.597038 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.597077 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.597087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.597101 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.597113 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.699851 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.699930 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.699947 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.699982 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.700003 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.802160 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.802209 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.802220 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.802239 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.802251 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.904540 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.904570 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.904582 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.904594 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:13 crc kubenswrapper[4883]: I0310 09:05:13.904603 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:13Z","lastTransitionTime":"2026-03-10T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.006669 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.006722 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.006734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.006753 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.006767 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.094773 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.105411 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.108548 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.108582 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.108592 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.108607 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.108617 4883 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.116567 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.130297 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.140596 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.149620 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.159501 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.169066 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.179385 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.189778 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.199084 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.210455 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.210515 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.210531 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.210554 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.210569 4883 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.213509 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.229364 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.245422 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.255469 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.264818 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05
:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.276235 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.313383 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.313508 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.313605 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.313686 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.313745 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.418220 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.418256 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.418269 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.418293 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.418306 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.419884 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerStarted","Data":"8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.425555 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.426921 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.430598 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.441465 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.450338 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.458543 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.466605 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.475064 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.488339 4883 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.498910 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.507328 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.517078 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.519942 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.519980 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.519990 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.520005 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.520016 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.527748 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.537167 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.547321 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc 
kubenswrapper[4883]: I0310 09:05:14.555393 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.567538 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.584178 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.595895 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.605243 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.617043 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.623236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.623271 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.623282 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.623299 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.623311 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.627177 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.635501 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.646905 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.657022 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.671244 4883 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.684242 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.693058 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.703816 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.714772 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.725923 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.725959 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.725971 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.725990 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.726003 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.728074 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.735965 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.745440 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.756131 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.777137 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.786577 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.828590 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.828633 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.828643 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc 
kubenswrapper[4883]: I0310 09:05:14.828660 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.828670 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.932670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.932715 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.932727 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.932789 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:14 crc kubenswrapper[4883]: I0310 09:05:14.932813 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:14Z","lastTransitionTime":"2026-03-10T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.036343 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.036385 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.036395 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.038083 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.038140 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.078900 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.078900 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.078966 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.079027 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:15 crc kubenswrapper[4883]: E0310 09:05:15.079184 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:15 crc kubenswrapper[4883]: E0310 09:05:15.079316 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:15 crc kubenswrapper[4883]: E0310 09:05:15.079532 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:15 crc kubenswrapper[4883]: E0310 09:05:15.079561 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.141239 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.141274 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.141285 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.141297 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.141306 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.243949 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.244014 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.244030 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.244053 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.244068 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.346884 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.346925 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.346935 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.346951 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.346961 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.449444 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.449521 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.449533 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.449551 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.449564 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.552403 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.552715 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.552728 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.552743 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.552753 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.655431 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.655489 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.655500 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.655516 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.655526 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.757698 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.757763 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.757775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.757795 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.757808 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.860039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.860089 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.860100 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.860122 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.860133 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.961974 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.962128 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.962205 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.962275 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:15 crc kubenswrapper[4883]: I0310 09:05:15.962343 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:15Z","lastTransitionTime":"2026-03-10T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.064711 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.064753 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.064766 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.064782 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.064795 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.079780 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:05:16 crc kubenswrapper[4883]: E0310 09:05:16.079963 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.167114 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.167139 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.167148 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.167174 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.167186 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.269273 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.269319 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.269329 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.269345 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.269357 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.371259 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.371300 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.371310 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.371326 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.371336 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.437931 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.439609 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7xb47" event={"ID":"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0","Type":"ContainerStarted","Data":"128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.442461 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.445055 4883 generic.go:334] "Generic (PLEG): container finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd" exitCode=0 Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.445095 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.454792 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.468452 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.473697 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.473735 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.473745 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.473764 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.473775 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.479610 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.489027 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.499518 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.514343 4883 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.526242 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.535615 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.546089 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.556108 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.568706 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.577305 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.577349 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.577362 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc 
kubenswrapper[4883]: I0310 09:05:16.577382 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.577397 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.577806 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc 
kubenswrapper[4883]: I0310 09:05:16.586300 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.597031 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.612821 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.623953 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.634898 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.642213 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.651892 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.661639 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
9:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.670582 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.679549 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.679596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.679606 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.679626 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.679638 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.686328 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.697490 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.706280 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.715578 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.724662 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.735322 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.751555 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.761664 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.772570 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.782718 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.782762 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.782775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.782793 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.782815 4883 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.784672 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.800981 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.811280 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.820612 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:16Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.885959 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.886109 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.886198 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.886270 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.886334 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.988559 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.988590 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.988601 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.988617 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:16 crc kubenswrapper[4883]: I0310 09:05:16.988629 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:16Z","lastTransitionTime":"2026-03-10T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.079266 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.079313 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:17 crc kubenswrapper[4883]: E0310 09:05:17.079380 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.079397 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:17 crc kubenswrapper[4883]: E0310 09:05:17.079536 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.079593 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:17 crc kubenswrapper[4883]: E0310 09:05:17.079682 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:17 crc kubenswrapper[4883]: E0310 09:05:17.079796 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.091063 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.091097 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.091110 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.091126 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.091138 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.193608 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.193914 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.193932 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.193953 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.193967 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.296248 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.296307 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.296321 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.296340 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.296351 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.399592 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.399637 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.399646 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.399661 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.399674 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.452685 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" event={"ID":"5fd36c79-e84e-49aa-97b9-616563193cd2","Type":"ContainerStarted","Data":"0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.453055 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" event={"ID":"5fd36c79-e84e-49aa-97b9-616563193cd2","Type":"ContainerStarted","Data":"28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.455224 4883 generic.go:334] "Generic (PLEG): container finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09" exitCode=0 Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.455354 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.467850 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.487258 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.498776 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.502297 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.502335 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.502348 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc 
kubenswrapper[4883]: I0310 09:05:17.502365 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.502377 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.509884 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc 
kubenswrapper[4883]: I0310 09:05:17.519126 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.529648 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.540382 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.549274 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.572628 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.588192 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.597640 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.604768 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.604971 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.604980 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.604996 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.605006 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.609451 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.619666 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.629652 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.639464 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.649425 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.662279 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.673645 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.683346 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.698780 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.707591 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.707636 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.707651 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.707965 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.707998 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.712994 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.724191 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.735652 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.745365 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.754789 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.767988 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.778988 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.789503 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.802346 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
20f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.810415 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.810453 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.810465 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.810496 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.810508 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.815951 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.825033 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.834406 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc 
kubenswrapper[4883]: I0310 09:05:17.841752 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q
sr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.850582 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:17Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.913411 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.913454 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.913469 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.913509 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:17 crc kubenswrapper[4883]: I0310 09:05:17.913521 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:17Z","lastTransitionTime":"2026-03-10T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.016070 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.016105 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.016115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.016129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.016144 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.118580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.118632 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.118642 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.118661 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.118677 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.221272 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.221314 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.221324 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.221340 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.221354 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.324087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.324142 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.324154 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.324174 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.324195 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.426620 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.426658 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.426670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.426685 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.426697 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.463575 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.463895 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.467065 4883 generic.go:334] "Generic (PLEG): container finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba" exitCode=0 Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.467163 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.481033 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.490981 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.493750 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.503772 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.512857 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.521628 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.528733 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.528760 4883 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.528771 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.528784 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.528794 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.534257 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.554121 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.563512 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.572969 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.583800 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.593853 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.603464 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.614111 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.622696 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.630742 4883 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.630781 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.630793 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.630811 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.630823 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.638774 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.652236 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.660678 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.668420 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.678799 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.690117 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 
09:05:18.698224 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.707608 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.716688 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.725379 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.732904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.732940 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.732955 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.732971 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.732983 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.734728 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.749348 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.759414 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:1
2Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.769452 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.779021 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.788408 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.789221 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.789268 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.789279 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc 
kubenswrapper[4883]: I0310 09:05:18.789294 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.789305 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.797424 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc 
kubenswrapper[4883]: E0310 09:05:18.798907 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.801631 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.801655 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.801664 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.801677 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.801687 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.805234 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.810376 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.812809 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.812829 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.812840 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.812852 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.812861 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.816641 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.824915 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.828836 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.828869 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.828883 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.828895 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.828903 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.837227 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.838647 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.841340 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.841373 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.841385 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.841397 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.841406 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.850079 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:18Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.850209 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.851526 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.851567 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.851579 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.851596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.851608 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.953915 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.953949 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.953962 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.953975 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.953985 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:18Z","lastTransitionTime":"2026-03-10T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.969345 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.969536 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 09:05:50.969517858 +0000 UTC m=+137.224415737 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.969581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:18 crc kubenswrapper[4883]: I0310 09:05:18.969634 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.969765 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.969829 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.969872 4883 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:50.969865586 +0000 UTC m=+137.224763475 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:05:18 crc kubenswrapper[4883]: E0310 09:05:18.969963 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:50.969944946 +0000 UTC m=+137.224842835 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.056804 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.057023 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.057099 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.057193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.057252 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.070709 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.070792 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.070840 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.070909 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.070937 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.070951 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.070988 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071024 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071047 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071001 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:51.070989536 +0000 UTC m=+137.325887424 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071064 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071111 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:51.071082491 +0000 UTC m=+137.325980380 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.071236 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:05:51.071165469 +0000 UTC m=+137.326063357 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.079094 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.079109 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.079111 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.079131 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.079460 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.079591 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.079678 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:19 crc kubenswrapper[4883]: E0310 09:05:19.079813 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.159741 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.159784 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.159795 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.159812 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.159823 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.262552 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.262603 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.262613 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.262634 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.262648 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.364585 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.364617 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.364626 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.364647 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.364658 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.467463 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.468051 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.468068 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.468096 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.468116 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.471972 4883 generic.go:334] "Generic (PLEG): container finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146" exitCode=0 Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.472058 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.473558 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/0.log" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.477527 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232" exitCode=1 Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.477580 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.478306 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.478499 4883 scope.go:117] "RemoveContainer" containerID="13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.489366 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75
079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.507351 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.507613 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.523214 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.538530 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.551083 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.561075 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.570749 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.570787 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.570798 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc 
kubenswrapper[4883]: I0310 09:05:19.570820 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.570835 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.572370 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.583638 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.595008 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.606486 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.618584 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
20f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.634509 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.644842 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.653330 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc 
kubenswrapper[4883]: I0310 09:05:19.661569 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.670976 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.672951 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.672973 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.672985 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.673007 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.673020 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.682556 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z 
is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.691926 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.702344 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.719266 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.730062 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.748947 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc 
kubenswrapper[4883]: I0310 09:05:19.759915 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q
sr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.771456 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.775518 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.775574 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.775588 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.775609 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.775625 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.789722 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.799555 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pro
xy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.822048 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401661 6670 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 
policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401995 6670 obj_retry.go:551] Creating *factory.egressNode crc took: 1.772885ms\\\\nI0310 09:05:19.402027 6670 factory.go:1336] Added *v1.Node event handler 7\\\\nI0310 09:05:19.402065 6670 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 09:05:19.402516 6670 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:05:19.402601 6670 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:05:19.402639 6670 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:05:19.402671 6670 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:05:19.402735 6670 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff25
6bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.834341 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.848898 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.871891 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.877572 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.877608 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.877618 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.877633 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.877643 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.884653 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.897794 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16f
f495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.937176 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.956726 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:19Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.979322 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.979368 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.979378 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.979396 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:19 crc kubenswrapper[4883]: I0310 09:05:19.979408 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:19Z","lastTransitionTime":"2026-03-10T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.083408 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.083439 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.083450 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.083469 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.083499 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.186182 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.186222 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.186232 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.186246 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.186257 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.288542 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.288584 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.288594 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.288611 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.288621 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.390455 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.390522 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.390533 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.390552 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.390568 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.482119 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/1.log" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.482670 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/0.log" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.485229 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85" exitCode=1 Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.485326 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.485401 4883 scope.go:117] "RemoveContainer" containerID="13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.486102 4883 scope.go:117] "RemoveContainer" containerID="f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85" Mar 10 09:05:20 crc kubenswrapper[4883]: E0310 09:05:20.486323 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.489278 4883 generic.go:334] "Generic (PLEG): container 
finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0" exitCode=0 Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.489318 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.492176 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.492214 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.492225 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.492241 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.492252 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.499418 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.513741 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229dd
d3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.526358 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.535664 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42
ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.552520 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401661 6670 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401995 6670 obj_retry.go:551] Creating *factory.egressNode crc took: 1.772885ms\\\\nI0310 09:05:19.402027 6670 factory.go:1336] Added *v1.Node event handler 7\\\\nI0310 09:05:19.402065 6670 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 09:05:19.402516 6670 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:05:19.402601 6670 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:05:19.402639 6670 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:05:19.402671 6670 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:05:19.402735 6670 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.563805 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.571349 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.583389 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.592405 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.594562 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.594596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.594608 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc 
kubenswrapper[4883]: I0310 09:05:20.594625 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.594637 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.602288 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 
09:05:20.612414 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.622690 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.632419 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.644013 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e3
20f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.662443 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.674776 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.683965 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc 
kubenswrapper[4883]: I0310 09:05:20.694438 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q
sr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.697525 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.697564 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.697576 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.697601 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.697616 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.704059 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z 
is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.713058 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.731894 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401661 6670 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401995 6670 obj_retry.go:551] Creating *factory.egressNode crc took: 1.772885ms\\\\nI0310 09:05:19.402027 6670 factory.go:1336] Added *v1.Node event handler 7\\\\nI0310 09:05:19.402065 6670 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 09:05:19.402516 6670 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:05:19.402601 6670 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:05:19.402639 6670 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:05:19.402671 6670 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:05:19.402735 6670 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.770436 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.799701 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.799737 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.799748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.799765 4883 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.799775 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.806355 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.849005 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.887944 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.902502 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.902557 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.902570 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:20 crc 
kubenswrapper[4883]: I0310 09:05:20.902593 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.902606 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:20Z","lastTransitionTime":"2026-03-10T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.929114 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:20 crc kubenswrapper[4883]: I0310 09:05:20.969004 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.004336 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.004370 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.004381 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.004397 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.004408 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.009541 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.048129 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.079892 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.079995 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.079923 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.080213 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:21 crc kubenswrapper[4883]: E0310 09:05:21.080311 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:21 crc kubenswrapper[4883]: E0310 09:05:21.080460 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:21 crc kubenswrapper[4883]: E0310 09:05:21.080610 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:21 crc kubenswrapper[4883]: E0310 09:05:21.080655 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.088929 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.107034 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.107132 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.107232 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.107307 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.107366 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.132409 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.168261 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.209618 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc 
kubenswrapper[4883]: I0310 09:05:21.209915 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.209939 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.209952 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.209970 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.209985 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.247525 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.313496 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.313533 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.313544 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.313559 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.313570 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.415838 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.415870 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.415881 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.415895 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.415904 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.498220 4883 generic.go:334] "Generic (PLEG): container finished" podID="6c845e62-37a1-473c-a4d0-a354594903bc" containerID="c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270" exitCode=0 Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.498324 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerDied","Data":"c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.500275 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/1.log" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.504297 4883 scope.go:117] "RemoveContainer" containerID="f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85" Mar 10 09:05:21 crc kubenswrapper[4883]: E0310 09:05:21.504465 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.508292 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.519436 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.519865 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.519876 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.519444 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.519893 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc 
kubenswrapper[4883]: I0310 09:05:21.519906 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.530116 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.546867 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13d1b917aa710d18d1a37ef0d9224a5a74a4c1724234b225892b5f000433a232\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401661 6670 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 09:05:19.401995 6670 obj_retry.go:551] Creating *factory.egressNode crc took: 1.772885ms\\\\nI0310 09:05:19.402027 6670 factory.go:1336] Added *v1.Node event handler 7\\\\nI0310 09:05:19.402065 6670 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0310 09:05:19.402516 6670 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0310 09:05:19.402601 6670 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0310 09:05:19.402639 6670 ovnkube.go:599] Stopped ovnkube\\\\nI0310 09:05:19.402671 6670 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 09:05:19.402735 6670 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.557984 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab
878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.565763 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.574319 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.582397 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.607072 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.622684 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.622718 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.622731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.622755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.622771 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.648884 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.688330 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.726379 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.726504 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.726565 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.726623 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.726879 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.727844 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.768042 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.812408 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.829580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.829619 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.829631 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.829650 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.829665 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.848434 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.886980 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.928144 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.931802 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.931863 4883 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.931876 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.931922 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.931939 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:21Z","lastTransitionTime":"2026-03-10T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:21 crc kubenswrapper[4883]: I0310 09:05:21.968555 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:21Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.007151 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.034148 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.034189 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.034200 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.034228 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.034244 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.051677 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.091359 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.129183 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.136690 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.136731 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.136745 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.136760 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.136775 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.168515 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.208457 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.240077 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.240119 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.240129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc 
kubenswrapper[4883]: I0310 09:05:22.240148 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.240161 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.247461 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.289176 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42
ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.329052 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.342071 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.342106 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.342117 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.342135 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.342149 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.369848 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.408586 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.443981 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.444016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.444027 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.444046 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.444062 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.452144 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.488276 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.510638 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" event={"ID":"6c845e62-37a1-473c-a4d0-a354594903bc","Type":"ContainerStarted","Data":"b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.527941 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc 
kubenswrapper[4883]: I0310 09:05:22.545614 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.545658 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.545671 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.545691 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.545705 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.567685 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.610027 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.647262 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.647296 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.647308 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.647323 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc 
kubenswrapper[4883]: I0310 09:05:22.647334 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.651897 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.690915 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.726036 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.749745 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.749780 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.749789 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.749802 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.749813 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.767569 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.807006 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.846739 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.852467 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.852591 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.852654 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.852740 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.852801 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.887086 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.928249 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.956224 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.956288 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.956304 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.956328 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.956348 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:22Z","lastTransitionTime":"2026-03-10T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:22 crc kubenswrapper[4883]: I0310 09:05:22.968362 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:22Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.008914 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.053309 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.058996 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.059032 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.059041 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.059059 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.059071 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.079118 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.079126 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.079148 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.079177 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:23 crc kubenswrapper[4883]: E0310 09:05:23.079337 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:23 crc kubenswrapper[4883]: E0310 09:05:23.079509 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:23 crc kubenswrapper[4883]: E0310 09:05:23.079601 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:23 crc kubenswrapper[4883]: E0310 09:05:23.079665 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.087752 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.127103 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc 
kubenswrapper[4883]: I0310 09:05:23.160932 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.160969 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.160982 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.160999 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.161009 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.167375 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.209366 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.248312 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.262608 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc 
kubenswrapper[4883]: I0310 09:05:23.262639 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.262655 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.262670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.262681 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.287885 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:23Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.365017 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.365058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.365069 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.365089 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.365102 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.467650 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.467691 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.467700 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.467713 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.467721 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.570008 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.570047 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.570058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.570074 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.570089 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.671941 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.671977 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.671987 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.672000 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.672010 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.774192 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.774245 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.774258 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.774277 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.774291 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.876683 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.876725 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.876736 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.876755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.876768 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.978292 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.978328 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.978338 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.978350 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:23 crc kubenswrapper[4883]: I0310 09:05:23.978358 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:23Z","lastTransitionTime":"2026-03-10T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.080563 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.080602 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.080614 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.080631 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.080643 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.089763 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.096174 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72
d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.105772 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.114118 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc 
kubenswrapper[4883]: I0310 09:05:24.122633 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.132934 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.143492 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.151903 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.165621 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.177137 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.182255 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.182281 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.182291 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.182307 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.182322 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.187796 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.200501 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.210320 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.219132 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.228092 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.238018 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.247520 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.259284 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:24Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.284115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.284145 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.284155 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.284171 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.284183 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.386790 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.387009 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.387074 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.387151 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.387213 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.488715 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.489139 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.489214 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.489291 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.489350 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.591567 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.591604 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.591618 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.591632 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.591642 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.694297 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.694337 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.694349 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.694365 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.694378 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.796944 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.796986 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.796998 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.797016 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.797026 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.899304 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.899339 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.899352 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.899371 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:24 crc kubenswrapper[4883]: I0310 09:05:24.899387 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:24Z","lastTransitionTime":"2026-03-10T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.001725 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.001755 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.001768 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.001780 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.001789 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.079834 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.079855 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.079869 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:25 crc kubenswrapper[4883]: E0310 09:05:25.079952 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:25 crc kubenswrapper[4883]: E0310 09:05:25.080033 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.080041 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:25 crc kubenswrapper[4883]: E0310 09:05:25.080100 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:25 crc kubenswrapper[4883]: E0310 09:05:25.080277 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.103955 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.103993 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.104004 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.104019 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.104028 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.205703 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.205758 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.205770 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.205790 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.205802 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.308198 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.308257 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.308269 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.308290 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.308306 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.410861 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.410902 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.410919 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.410935 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.410946 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.513061 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.513104 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.513113 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.513129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.513138 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.615767 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.615798 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.615808 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.615822 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.615830 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.718129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.718171 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.718181 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.718198 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.718207 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.820283 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.820327 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.820336 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.820356 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.820368 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.922616 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.922662 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.922674 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.922692 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:25 crc kubenswrapper[4883]: I0310 09:05:25.922702 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:25Z","lastTransitionTime":"2026-03-10T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.024427 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.024457 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.024465 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.024494 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.024504 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.126377 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.126404 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.126413 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.126423 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.126432 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.227971 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.227992 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.227999 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.228009 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.228017 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.329772 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.329801 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.329811 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.329819 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.329826 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.431451 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.431552 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.431567 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.431590 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.431603 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.533121 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.533199 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.533218 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.533268 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.533281 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.635115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.635177 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.635189 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.635205 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.635216 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.736751 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.736775 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.736784 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.736795 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.736804 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.838004 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.838031 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.838039 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.838048 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.838058 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.939850 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.939907 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.939920 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.939936 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:26 crc kubenswrapper[4883]: I0310 09:05:26.939947 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:26Z","lastTransitionTime":"2026-03-10T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.041850 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.041879 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.041887 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.041899 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.041907 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.079722 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.079881 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.079885 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.079914 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:27 crc kubenswrapper[4883]: E0310 09:05:27.079996 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:27 crc kubenswrapper[4883]: E0310 09:05:27.080064 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.080089 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:05:27 crc kubenswrapper[4883]: E0310 09:05:27.080125 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:27 crc kubenswrapper[4883]: E0310 09:05:27.080212 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.143961 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.143993 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.144005 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.144021 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.144031 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.247960 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.248264 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.248275 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.248292 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.248304 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.350670 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.350714 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.350725 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.350762 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.350780 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.453009 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.453070 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.453084 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.453110 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.453123 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.527112 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.528709 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.529106 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.542306 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f3
6cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.554278 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.555400 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.555437 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.555450 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.555467 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.555493 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.564734 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.574356 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.582585 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc 
kubenswrapper[4883]: I0310 09:05:27.591151 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.601577 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.616688 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.627517 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:
35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.635223 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.646287 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.657169 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b391942524
0e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a5550885
03f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10
T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.658111 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.658157 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.658172 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.658193 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.658206 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.666119 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.674694 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.682130 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.692298 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.701886 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.717492 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:27Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.763994 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.764046 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.764064 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.764082 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.764093 4883 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.866822 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.866865 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.866876 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.866897 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.866907 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.968888 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.968924 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.968935 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.968949 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:27 crc kubenswrapper[4883]: I0310 09:05:27.968958 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:27Z","lastTransitionTime":"2026-03-10T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.071398 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.071429 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.071438 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.071453 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.071463 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.173794 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.173829 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.173839 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.173851 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.173861 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.275860 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.275905 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.275916 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.275933 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.275944 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.378420 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.378579 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.378652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.378716 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.378793 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.481087 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.481256 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.481317 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.481383 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.481436 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.583538 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.583892 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.583967 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.584037 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.584103 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.686068 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.686102 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.686113 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.686125 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.686137 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.788385 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.788409 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.788419 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.788430 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.788440 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.890770 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.890813 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.890826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.890838 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.890846 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.981872 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.981898 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.981909 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.981921 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.981929 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:28 crc kubenswrapper[4883]: E0310 09:05:28.992196 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:28Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.995138 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.995170 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.995180 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.995191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:28 crc kubenswrapper[4883]: I0310 09:05:28.995200 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:28Z","lastTransitionTime":"2026-03-10T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.004902 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:29Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.008041 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.008068 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.008079 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.008091 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.008099 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.017271 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:29Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.020608 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.020631 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.020641 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.020654 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.020661 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.030324 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:29Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.033277 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.033303 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.033313 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.033324 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.033335 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.042009 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:29Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.042109 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.043208 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.043234 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.043245 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.043265 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.043275 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.078853 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.078895 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.078862 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.078853 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.078957 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.079008 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.079064 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:29 crc kubenswrapper[4883]: E0310 09:05:29.079195 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.144963 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.144992 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.145001 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.145015 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.145024 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.252738 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.252776 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.252788 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.252803 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.252814 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.355225 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.355274 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.355287 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.355300 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.355311 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.457705 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.457740 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.457748 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.457762 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.457771 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.559514 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.559557 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.559566 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.559584 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.559599 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.661581 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.661621 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.661632 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.661643 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.661653 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.763621 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.763659 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.763669 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.763684 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.763695 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.865509 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.865544 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.865554 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.865565 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.865575 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.967518 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.967558 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.967569 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.967584 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:29 crc kubenswrapper[4883]: I0310 09:05:29.967595 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:29Z","lastTransitionTime":"2026-03-10T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.069677 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.069715 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.069725 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.069737 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.069746 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.172180 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.172225 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.172234 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.172253 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.172272 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.274638 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.274686 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.274697 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.274714 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.274724 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.376997 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.377032 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.377043 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.377060 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.377070 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.479197 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.479220 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.479232 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.479244 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.479254 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.581469 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.581633 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.581693 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.581759 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.581812 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.683258 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.683304 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.683313 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.683327 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.683335 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.785743 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.785773 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.785784 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.785796 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.785804 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.887771 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.887803 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.887813 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.887828 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.887837 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.990230 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.990264 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.990283 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.990298 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:30 crc kubenswrapper[4883]: I0310 09:05:30.990308 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:30Z","lastTransitionTime":"2026-03-10T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.079227 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.079330 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.079330 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:31 crc kubenswrapper[4883]: E0310 09:05:31.079432 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:31 crc kubenswrapper[4883]: E0310 09:05:31.079512 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:31 crc kubenswrapper[4883]: E0310 09:05:31.079578 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.079714 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:31 crc kubenswrapper[4883]: E0310 09:05:31.079921 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.092058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.092089 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.092098 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.092112 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.092123 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.193743 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.193769 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.193779 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.193792 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.193800 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.296226 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.296253 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.296263 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.296294 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.296305 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.399021 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.399068 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.399079 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.399094 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.399104 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.501649 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.501677 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.501688 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.501699 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.501708 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.604120 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.604148 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.604156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.604164 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.604172 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.706143 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.706180 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.706191 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.706206 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.706216 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.808678 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.808711 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.808721 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.808734 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.808743 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.910536 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.910561 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.910570 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.910580 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:31 crc kubenswrapper[4883]: I0310 09:05:31.910588 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:31Z","lastTransitionTime":"2026-03-10T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.012943 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.012979 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.012990 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.013002 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.013010 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.115079 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.115107 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.115118 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.115129 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.115138 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.217314 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.217356 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.217368 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.217387 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.217399 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.319088 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.319128 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.319139 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.319156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.319168 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.422134 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.422181 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.422194 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.422216 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.422229 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.524414 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.524452 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.524463 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.524498 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.524509 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.626798 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.626835 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.626846 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.626862 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.626872 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.729488 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.729523 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.729533 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.729568 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.729579 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.833265 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.833306 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.833316 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.833329 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.833339 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.935655 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.935709 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.935719 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.935739 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:32 crc kubenswrapper[4883]: I0310 09:05:32.935754 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:32Z","lastTransitionTime":"2026-03-10T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.038351 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.038399 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.038409 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.038429 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.038441 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.079688 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.079721 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.079719 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.079731 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:33 crc kubenswrapper[4883]: E0310 09:05:33.079846 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:33 crc kubenswrapper[4883]: E0310 09:05:33.080023 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:33 crc kubenswrapper[4883]: E0310 09:05:33.080305 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:33 crc kubenswrapper[4883]: E0310 09:05:33.080523 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.080593 4883 scope.go:117] "RemoveContainer" containerID="f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.141188 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.141227 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.141249 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.141268 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.141290 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.243230 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.243510 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.243521 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.243535 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.243544 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.345693 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.345733 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.345746 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.345764 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.345775 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.447932 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.447968 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.447978 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.447993 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.448002 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.549594 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.549640 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.549652 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.549671 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.549687 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.550400 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/1.log" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.564131 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.564742 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.582884 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.593342 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.608550 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.619004 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.627135 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.636840 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.643984 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.652056 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.652092 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.652102 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.652115 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.652125 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.653591 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z 
is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.661712 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.669170 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pro
xy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.684714 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.695311 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.704372 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.713777 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.722559 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.731719 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.746868 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.755794 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.755826 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.755838 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.755857 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.755869 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.756875 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.857981 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.858038 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.858049 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.858071 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.858083 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.960660 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.960700 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.960708 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.960723 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:33 crc kubenswrapper[4883]: I0310 09:05:33.960732 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:33Z","lastTransitionTime":"2026-03-10T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:34 crc kubenswrapper[4883]: E0310 09:05:34.061459 4883 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.090705 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\"
:\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.101638 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.111698 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.120793 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42
ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.138296 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.149453 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: E0310 09:05:34.151182 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.158345 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.169051 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.177497 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac
37d539657eb0a1e640e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.187327 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.196940 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.205708 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.214777 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.224317 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.238079 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.246737 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.254854 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.262864 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.569710 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/2.log" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 
09:05:34.570449 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/1.log" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.573626 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" exitCode=1 Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.573672 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76"} Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.573715 4883 scope.go:117] "RemoveContainer" containerID="f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.574268 4883 scope.go:117] "RemoveContainer" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" Mar 10 09:05:34 crc kubenswrapper[4883]: E0310 09:05:34.574458 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.591867 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.601877 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.609946 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc 
kubenswrapper[4883]: I0310 09:05:34.618536 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.630134 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.639368 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.648708 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.657062 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.674653 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a76992cf81082ddd99edd68c1499b7ca1155befd50b5187ef97e446a09df85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 09:05:20.376085 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 09:05:20.376108 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 09:05:20.376152 6819 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 09:05:20.376181 6819 
handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 09:05:20.376211 6819 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 09:05:20.376248 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 09:05:20.376290 6819 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 09:05:20.376364 6819 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 09:05:20.376370 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 09:05:20.376376 6819 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 09:05:20.376387 6819 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 09:05:20.376462 6819 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 09:05:20.376573 6819 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 09:05:20.376650 6819 factory.go:656] Stopping watch factory\\\\nI0310 09:05:20.376673 6819 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal 
error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.690382 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.698159 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.707600 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.716419 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.724752 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.733245 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.742331 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.752073 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:34 crc kubenswrapper[4883]: I0310 09:05:34.761631 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:34Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.079808 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.079852 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:35 crc kubenswrapper[4883]: E0310 09:05:35.079917 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.079998 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:35 crc kubenswrapper[4883]: E0310 09:05:35.080088 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.080192 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:35 crc kubenswrapper[4883]: E0310 09:05:35.080343 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:35 crc kubenswrapper[4883]: E0310 09:05:35.080201 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.579121 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/2.log" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.584572 4883 scope.go:117] "RemoveContainer" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" Mar 10 09:05:35 crc kubenswrapper[4883]: E0310 09:05:35.584761 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.595406 4883 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.606489 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.616912 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.625148 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.635933 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.651962 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.660679 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.667896 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.676872 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.688923 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.698712 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.707658 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.716844 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.725928 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.739677 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.749599 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.757793 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:35 crc kubenswrapper[4883]: I0310 09:05:35.767001 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:35Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:37 crc kubenswrapper[4883]: I0310 09:05:37.079608 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:37 crc kubenswrapper[4883]: I0310 09:05:37.079670 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:37 crc kubenswrapper[4883]: I0310 09:05:37.079621 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:37 crc kubenswrapper[4883]: E0310 09:05:37.079762 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:37 crc kubenswrapper[4883]: I0310 09:05:37.079774 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:37 crc kubenswrapper[4883]: E0310 09:05:37.079867 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:37 crc kubenswrapper[4883]: E0310 09:05:37.080091 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:37 crc kubenswrapper[4883]: E0310 09:05:37.080138 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.517272 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.528976 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.539422 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.548639 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.557978 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.566859 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc 
kubenswrapper[4883]: I0310 09:05:38.574400 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.583745 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.598700 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.608015 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:
35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.616540 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.625549 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.635797 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b391942524
0e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a5550885
03f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10
T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.642960 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.651745 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.659921 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.668386 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.676996 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:38 crc kubenswrapper[4883]: I0310 09:05:38.694802 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:38Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.079378 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.079439 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.079448 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.079521 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.079649 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.079539 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.079821 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.080147 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.153061 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.303133 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.303169 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.303181 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.303198 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.303210 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:39Z","lastTransitionTime":"2026-03-10T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.313685 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.317199 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.317300 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.317315 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.317338 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.317350 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:39Z","lastTransitionTime":"2026-03-10T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.326232 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.328857 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.328887 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.328898 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.328907 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.328917 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:39Z","lastTransitionTime":"2026-03-10T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.339856 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.339885 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.339894 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.339905 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.339913 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:39Z","lastTransitionTime":"2026-03-10T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.350882 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.350904 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.350915 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.350924 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:39 crc kubenswrapper[4883]: I0310 09:05:39.350932 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:39Z","lastTransitionTime":"2026-03-10T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.359268 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:39Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:39 crc kubenswrapper[4883]: E0310 09:05:39.359395 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:05:41 crc kubenswrapper[4883]: I0310 09:05:41.079171 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:41 crc kubenswrapper[4883]: I0310 09:05:41.079268 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:41 crc kubenswrapper[4883]: E0310 09:05:41.079328 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:41 crc kubenswrapper[4883]: I0310 09:05:41.079372 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:41 crc kubenswrapper[4883]: E0310 09:05:41.079520 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:41 crc kubenswrapper[4883]: E0310 09:05:41.079700 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:41 crc kubenswrapper[4883]: I0310 09:05:41.079761 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:41 crc kubenswrapper[4883]: E0310 09:05:41.079886 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:43 crc kubenswrapper[4883]: I0310 09:05:43.079198 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:43 crc kubenswrapper[4883]: E0310 09:05:43.079644 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:43 crc kubenswrapper[4883]: I0310 09:05:43.079265 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:43 crc kubenswrapper[4883]: E0310 09:05:43.079723 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:43 crc kubenswrapper[4883]: I0310 09:05:43.079324 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:43 crc kubenswrapper[4883]: E0310 09:05:43.079790 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:43 crc kubenswrapper[4883]: I0310 09:05:43.079216 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:43 crc kubenswrapper[4883]: E0310 09:05:43.079837 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.090704 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},
{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473
a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.098842 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.107505 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.115269 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.122927 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.131105 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.146451 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: E0310 09:05:44.153568 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.156281 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.166117 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.173950 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.182406 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.189628 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc 
kubenswrapper[4883]: I0310 09:05:44.196463 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.205576 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.218309 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.226253 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:
35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.233603 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:44 crc kubenswrapper[4883]: I0310 09:05:44.242426 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:44Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:45 crc kubenswrapper[4883]: I0310 09:05:45.079050 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:45 crc kubenswrapper[4883]: I0310 09:05:45.079077 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:45 crc kubenswrapper[4883]: I0310 09:05:45.079068 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:45 crc kubenswrapper[4883]: I0310 09:05:45.079056 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:45 crc kubenswrapper[4883]: E0310 09:05:45.079181 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:45 crc kubenswrapper[4883]: E0310 09:05:45.079395 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:45 crc kubenswrapper[4883]: E0310 09:05:45.079429 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:45 crc kubenswrapper[4883]: E0310 09:05:45.079512 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:47 crc kubenswrapper[4883]: I0310 09:05:47.079103 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:47 crc kubenswrapper[4883]: I0310 09:05:47.079211 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:47 crc kubenswrapper[4883]: E0310 09:05:47.079285 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:47 crc kubenswrapper[4883]: I0310 09:05:47.079303 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:47 crc kubenswrapper[4883]: I0310 09:05:47.079373 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:47 crc kubenswrapper[4883]: E0310 09:05:47.079388 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:47 crc kubenswrapper[4883]: E0310 09:05:47.079526 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:47 crc kubenswrapper[4883]: I0310 09:05:47.080262 4883 scope.go:117] "RemoveContainer" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" Mar 10 09:05:47 crc kubenswrapper[4883]: E0310 09:05:47.080430 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:47 crc kubenswrapper[4883]: E0310 09:05:47.079620 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.079872 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.079923 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.079892 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.079990 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.080148 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.080386 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.080712 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.080984 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.154449 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.645489 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.645523 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.645532 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.645546 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.645559 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:49Z","lastTransitionTime":"2026-03-10T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.656285 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.659156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.659188 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.659200 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.659235 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.659245 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:49Z","lastTransitionTime":"2026-03-10T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.671058 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.671110 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.671123 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.671139 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.671149 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:49Z","lastTransitionTime":"2026-03-10T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.682646 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.682683 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.682695 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.682712 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.682722 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:49Z","lastTransitionTime":"2026-03-10T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.691362 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.698126 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.698171 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.698192 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.698206 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:49 crc kubenswrapper[4883]: I0310 09:05:49.698217 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:49Z","lastTransitionTime":"2026-03-10T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.710167 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:49Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:49 crc kubenswrapper[4883]: E0310 09:05:49.710290 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.049111 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.049238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.049273 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.049242862 +0000 UTC m=+201.304140761 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.049357 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.049409 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.049556 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.049563 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.049530219 +0000 UTC m=+201.304428108 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.049698 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.049644026 +0000 UTC m=+201.304541915 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.079379 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.079465 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.079604 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.079618 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.079688 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.079833 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.079906 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.080028 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.088907 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.149805 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.149864 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:51 crc kubenswrapper[4883]: I0310 09:05:51.149903 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150053 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150069 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150081 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150114 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.150105202 +0000 UTC m=+201.405003092 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150053 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150168 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150198 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.150184624 +0000 UTC m=+201.405082523 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150208 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150231 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:51 crc kubenswrapper[4883]: E0310 09:05:51.150292 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:06:55.150276108 +0000 UTC m=+201.405174007 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:05:53 crc kubenswrapper[4883]: I0310 09:05:53.079175 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:53 crc kubenswrapper[4883]: I0310 09:05:53.079211 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:53 crc kubenswrapper[4883]: I0310 09:05:53.079244 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:53 crc kubenswrapper[4883]: I0310 09:05:53.079279 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:53 crc kubenswrapper[4883]: E0310 09:05:53.079350 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:53 crc kubenswrapper[4883]: E0310 09:05:53.079512 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:53 crc kubenswrapper[4883]: E0310 09:05:53.079604 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:53 crc kubenswrapper[4883]: E0310 09:05:53.079651 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.094366 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.104528 4883 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.116457 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.126126 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.133978 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.143489 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: E0310 09:05:54.155278 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.158843 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.174842 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.184205 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.193779 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.201505 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.212068 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.221439 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.229594 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.238308 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.246795 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.259257 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.270439 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:54 crc kubenswrapper[4883]: I0310 09:05:54.278047 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:54Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:55 crc kubenswrapper[4883]: I0310 09:05:55.078914 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:55 crc kubenswrapper[4883]: I0310 09:05:55.078972 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:55 crc kubenswrapper[4883]: I0310 09:05:55.079027 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:55 crc kubenswrapper[4883]: I0310 09:05:55.079056 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:55 crc kubenswrapper[4883]: E0310 09:05:55.079212 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:55 crc kubenswrapper[4883]: E0310 09:05:55.079314 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:55 crc kubenswrapper[4883]: E0310 09:05:55.079355 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:55 crc kubenswrapper[4883]: E0310 09:05:55.079397 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:57 crc kubenswrapper[4883]: I0310 09:05:57.079264 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:57 crc kubenswrapper[4883]: I0310 09:05:57.079313 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:57 crc kubenswrapper[4883]: E0310 09:05:57.079432 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:57 crc kubenswrapper[4883]: I0310 09:05:57.079523 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:57 crc kubenswrapper[4883]: E0310 09:05:57.079618 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:57 crc kubenswrapper[4883]: I0310 09:05:57.079715 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:57 crc kubenswrapper[4883]: E0310 09:05:57.079759 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:57 crc kubenswrapper[4883]: E0310 09:05:57.080112 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.080012 4883 scope.go:117] "RemoveContainer" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.659194 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/2.log" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.661564 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.662083 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:05:58 crc 
kubenswrapper[4883]: I0310 09:05:58.676068 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.687790 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.700187 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.713396 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b391942524
0e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a5550885
03f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10
T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.721985 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.732906 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.742264 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.751406 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.760679 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.775413 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.785890 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.798389 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.808470 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.819071 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.835025 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.844241 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc 
kubenswrapper[4883]: I0310 09:05:58.855221 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.866196 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:58 crc kubenswrapper[4883]: I0310 09:05:58.881158 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:58Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.079035 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.079111 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.079148 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.079048 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.079281 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.079206 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.079401 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.079538 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.156385 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.666559 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/3.log" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.667087 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/2.log" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.670008 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" exitCode=1 Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.670063 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.670127 4883 scope.go:117] "RemoveContainer" containerID="bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.670977 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.671202 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.685911 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.695160 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.705553 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.714710 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.725492 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42
ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.740725 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 
obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.752417 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.761083 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.770684 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.780339 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.789013 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.798415 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.807121 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.814724 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.825921 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.839569 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.848809 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858324 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858699 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858758 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858771 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858795 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.858812 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:59Z","lastTransitionTime":"2026-03-10T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.868653 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc 
kubenswrapper[4883]: E0310 09:05:59.870231 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.873708 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.873747 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.873761 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.873782 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.873795 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:59Z","lastTransitionTime":"2026-03-10T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.883448 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.886435 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.886553 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.886610 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.886680 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.886746 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:59Z","lastTransitionTime":"2026-03-10T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.895968 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.899145 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.899236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.899303 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.899386 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.899449 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:59Z","lastTransitionTime":"2026-03-10T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.908271 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.910815 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.910863 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.910875 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.910892 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:05:59 crc kubenswrapper[4883]: I0310 09:05:59.910904 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:05:59Z","lastTransitionTime":"2026-03-10T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.926297 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:59Z is after 2025-08-24T17:21:41Z" Mar 10 09:05:59 crc kubenswrapper[4883]: E0310 09:05:59.926415 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.675870 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/0.log" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.675929 4883 generic.go:334] "Generic (PLEG): container finished" podID="8e883c29-520e-4b1f-b49c-3df10450d467" containerID="8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3" exitCode=1 Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.676003 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerDied","Data":"8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3"} Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.676377 4883 scope.go:117] "RemoveContainer" containerID="8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.678830 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/3.log" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.683731 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:06:00 crc kubenswrapper[4883]: E0310 09:06:00.683858 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.686994 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.696138 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.715446 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd721e83a8ff0bdc21ee9f929ef5284b2e6006d814243d86be94536e91c96f76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:33Z\\\",\\\"message\\\":\\\"rplf\\\\nF0310 09:05:33.772408 7084 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:05:33Z is after 2025-08-24T17:21:41Z]\\\\nI0310 09:05:33.772414 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772426 7084 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772431 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 09:05:33.772439 7084 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-7xb47\\\\nI0310 09:05:33.772445 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 
obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.729595 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.738331 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.749715 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.759054 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.769401 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.779232 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.788103 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.797281 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.809761 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.825147 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.836793 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.846256 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.854778 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.863623 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.870437 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.879359 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"2026-03-10T09:05:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34\\\\n2026-03-10T09:05:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34 to /host/opt/cni/bin/\\\\n2026-03-10T09:05:14Z [verbose] multus-daemon started\\\\n2026-03-10T09:05:14Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:05:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.886234 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc 
kubenswrapper[4883]: I0310 09:06:00.892814 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.903359 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.916367 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.925050 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.933317 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.941510 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.948305 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.957150 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"2026-03-10T09:05:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34\\\\n2026-03-10T09:05:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34 to /host/opt/cni/bin/\\\\n2026-03-10T09:05:14Z [verbose] multus-daemon started\\\\n2026-03-10T09:05:14Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:05:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.965961 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.974656 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.983053 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:00 crc kubenswrapper[4883]: I0310 09:06:00.990990 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:00Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.012624 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.027078 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.036788 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.049077 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.060084 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.069080 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.079580 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.079665 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.079734 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.079748 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:01 crc kubenswrapper[4883]: E0310 09:06:01.079894 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:01 crc kubenswrapper[4883]: E0310 09:06:01.080032 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:01 crc kubenswrapper[4883]: E0310 09:06:01.080138 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:01 crc kubenswrapper[4883]: E0310 09:06:01.080237 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.689160 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/0.log" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.689245 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerStarted","Data":"498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0"} Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.702301 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.712571 4883 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.721496 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.730575 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.739524 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.747704 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc 
kubenswrapper[4883]: I0310 09:06:01.754927 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.765109 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.780319 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.789003 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:
35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.796468 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.806202 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"2026-03-10T09:05:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34\\\\n2026-03-10T09:05:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34 to /host/opt/cni/bin/\\\\n2026-03-10T09:05:14Z [verbose] multus-daemon started\\\\n2026-03-10T09:05:14Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:05:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.816995 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.825311 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.834274 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.843943 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.852498 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.860788 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:01 crc kubenswrapper[4883]: I0310 09:06:01.875141 4883 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:01Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:03 crc kubenswrapper[4883]: I0310 09:06:03.079571 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:03 crc kubenswrapper[4883]: I0310 09:06:03.079591 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:03 crc kubenswrapper[4883]: E0310 09:06:03.079728 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:03 crc kubenswrapper[4883]: I0310 09:06:03.079819 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:03 crc kubenswrapper[4883]: E0310 09:06:03.080022 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:03 crc kubenswrapper[4883]: E0310 09:06:03.080117 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:03 crc kubenswrapper[4883]: I0310 09:06:03.080163 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:03 crc kubenswrapper[4883]: E0310 09:06:03.080304 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.090504 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.101609 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.110422 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"2026-03-10T09:05:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34\\\\n2026-03-10T09:05:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34 to /host/opt/cni/bin/\\\\n2026-03-10T09:05:14Z [verbose] multus-daemon started\\\\n2026-03-10T09:05:14Z [verbose] Readiness Indicator file check\\\\n2026-03-10T09:05:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.120180 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.126676 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.134851 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.141923 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.149397 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: E0310 09:06:04.156909 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.160528 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.174382 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.183610 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.192551 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.200534 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.208868 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.217292 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.224088 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc 
kubenswrapper[4883]: I0310 09:06:04.232244 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.240649 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e
6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:04 crc kubenswrapper[4883]: I0310 09:06:04.252869 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:04Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:05 crc kubenswrapper[4883]: I0310 09:06:05.079509 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:05 crc kubenswrapper[4883]: I0310 09:06:05.079551 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:05 crc kubenswrapper[4883]: E0310 09:06:05.079640 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:05 crc kubenswrapper[4883]: I0310 09:06:05.079650 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:05 crc kubenswrapper[4883]: I0310 09:06:05.079807 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:05 crc kubenswrapper[4883]: E0310 09:06:05.079852 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:05 crc kubenswrapper[4883]: E0310 09:06:05.079920 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:05 crc kubenswrapper[4883]: E0310 09:06:05.079977 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:07 crc kubenswrapper[4883]: I0310 09:06:07.079830 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:07 crc kubenswrapper[4883]: I0310 09:06:07.080357 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:07 crc kubenswrapper[4883]: E0310 09:06:07.080606 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:07 crc kubenswrapper[4883]: I0310 09:06:07.080839 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:07 crc kubenswrapper[4883]: I0310 09:06:07.081248 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:07 crc kubenswrapper[4883]: E0310 09:06:07.082452 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:07 crc kubenswrapper[4883]: E0310 09:06:07.082510 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:07 crc kubenswrapper[4883]: E0310 09:06:07.082250 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.079652 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.079679 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.079735 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.079755 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.079805 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.079932 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.080066 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.080107 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.157884 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.977836 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.977883 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.977894 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.977915 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.977927 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:09Z","lastTransitionTime":"2026-03-10T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:09 crc kubenswrapper[4883]: E0310 09:06:09.990205 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:09Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.993805 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.993842 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.993852 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.993870 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:09 crc kubenswrapper[4883]: I0310 09:06:09.993881 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:09Z","lastTransitionTime":"2026-03-10T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:06:10 crc kubenswrapper[4883]: E0310 09:06:10.003234 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:10Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.006118 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.006145 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.006156 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.006167 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.006174 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:10Z","lastTransitionTime":"2026-03-10T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 10 09:06:10 crc kubenswrapper[4883]: E0310 09:06:10.015247 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:10Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.018189 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.018225 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.018236 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.018252 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.018263 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:10Z","lastTransitionTime":"2026-03-10T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:10 crc kubenswrapper[4883]: E0310 09:06:10.027724 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:10Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.030554 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.030587 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.030596 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.030613 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:10 crc kubenswrapper[4883]: I0310 09:06:10.030623 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:10Z","lastTransitionTime":"2026-03-10T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:10 crc kubenswrapper[4883]: E0310 09:06:10.039795 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:10Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:10 crc kubenswrapper[4883]: E0310 09:06:10.039956 4883 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 09:06:11 crc kubenswrapper[4883]: I0310 09:06:11.079798 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:11 crc kubenswrapper[4883]: I0310 09:06:11.079792 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:11 crc kubenswrapper[4883]: E0310 09:06:11.079964 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:11 crc kubenswrapper[4883]: I0310 09:06:11.079994 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:11 crc kubenswrapper[4883]: E0310 09:06:11.080301 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:11 crc kubenswrapper[4883]: E0310 09:06:11.080576 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:11 crc kubenswrapper[4883]: I0310 09:06:11.080794 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:11 crc kubenswrapper[4883]: E0310 09:06:11.081192 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:13 crc kubenswrapper[4883]: I0310 09:06:13.079742 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:13 crc kubenswrapper[4883]: I0310 09:06:13.079774 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:13 crc kubenswrapper[4883]: E0310 09:06:13.079897 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:13 crc kubenswrapper[4883]: I0310 09:06:13.079930 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:13 crc kubenswrapper[4883]: I0310 09:06:13.079753 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:13 crc kubenswrapper[4883]: E0310 09:06:13.080103 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:13 crc kubenswrapper[4883]: E0310 09:06:13.080162 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:13 crc kubenswrapper[4883]: E0310 09:06:13.080262 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.080643 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:06:14 crc kubenswrapper[4883]: E0310 09:06:14.080828 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.090223 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.099599 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eda4645c256f9767b69f81bd667fa59dec86669fb0024b847ab32f1d1200711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d3638a4be088b1c26f67f367fc642fa397fc59b6bf7a804a16ff495a903f0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.109542 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ae85e4e3437bb065a301cbc2c172e45ebc6ae8b7fb4f7fe5d29478f2afad336\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.122224 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cd22653-be30-44cc-a583-0269f5c4d2d7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdffa45729fb51dfb6f084fbbd16f03c7691f9e3b2bce877264cd6ce4955e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bda2cc98eb654b2cf16d97b3c7220295e68e338e81ce25161c924d344c99c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b156e88d9f93fd752757476e27717790b1a7853622f141612db6b51e2b80b28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b2c32514993b3d08b1c6f387c8cca3e892f93a07039554f968f9b584778d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba7c69943b6b2d9ad285474d63571b96f266405dbc3b908282ee2c4065c0bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6114cc921e1f9e87d453ab3099b0462703f411c4254827da5d0b05547a314681\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa53e6d273b54222cbce53d5d5c2b825328cb0074dd649f73ba6222d90ac4902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99fa5a72d7a9ea183413e7b5abf4f37915982a21c83792f844d61979c7ecfd6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-10T09:03:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.130308 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e388aae3-c4b6-4377-9735-44b2149b3007\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7ed71e9a45bd780f292610f7c76b0418caa69eb4d02633ef39fdb4f159d1ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6b8668f001d471da29216b1f6c175e0fcda6c08b096db264260dd335107214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cfcb6be0162002c3242565a865d532ac6a7b7dd8f73a9487a8763adaef31f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2cf763df82a027e70bc3206458515f4218bf4e0101ecb98efa54795a53cc3a88\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.138187 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.145437 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd6597a3-f861-4126-933e-d6134c8bd4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64vnf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gmq5n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc 
kubenswrapper[4883]: I0310 09:06:14.151967 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ba16102-966f-4777-a385-01b6afe041ec\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd325b6244ababb4e841babcb69944542796c1eeaaac146e2c4eb7744b04d825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe7d47d7995c041bc0db51c6092421f508ac173c85eccd5d2a5f33423ffab08d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: E0310 09:06:14.158723 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.160996 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16dbc720-9dd7-4b18-a9da-02204c723f2c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T09:04:45Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 09:04:45.508766 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 09:04:45.508927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 09:04:45.509562 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3086184017/tls.crt::/tmp/serving-cert-3086184017/tls.key\\\\\\\"\\\\nI0310 09:04:45.618402 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 09:04:45.620161 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 09:04:45.620179 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 09:04:45.620197 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 09:04:45.620202 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 09:04:45.626109 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 09:04:45.626183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 09:04:45.626226 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 09:04:45.626244 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0310 09:04:45.626126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and 
discovery information is complete\\\\nW0310 09:04:45.626261 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 09:04:45.626303 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0310 09:04:45.628389 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.169822 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-p898z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e883c29-520e-4b1f-b49c-3df10450d467\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:59Z\\\",\\\"message\\\":\\\"2026-03-10T09:05:14+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34\\\\n2026-03-10T09:05:14+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef95ad6b-ade1-4a8c-83fa-9fc072c51a34 to /host/opt/cni/bin/\\\\n2026-03-10T09:05:14Z [verbose] multus-daemon started\\\\n2026-03-10T09:05:14Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T09:05:59Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wszn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-p898z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.182369 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bc34996-99c9-4f4d-a219-409f6081a593\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:03:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://043b0baf564741812c81d2022c44cbcceaf8f2e0eb094306cb0e7856a839f3b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10cb59a85571589df57cb70751a1e2d6ecf54f344f6ba6041478019e5602d332\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ebf6ccf0d5a0ca88d419b23b1a266ac454ed6685145335eca2939feabc39ed9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-10T09:03:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:03:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.189107 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vvbjw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53ffac75-0989-4945-915d-4aacec270cdb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://082212ecaf6bcef64e6266da0dc28eb27efbcf08e9b0bcb279ba9703196889ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qsr4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vvbjw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.202513 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T09:05:58Z\\\",\\\"message\\\":\\\" 7350 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-p898z after 0 failed attempt(s)\\\\nI0310 09:05:58.765778 7350 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 09:05:58.765786 7350 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-p898z\\\\nI0310 09:05:58.764586 7350 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765798 7350 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0310 09:05:58.765807 7350 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0310 09:05:58.765843 7350 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nF0310 09:05:58.765851 7350 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://131f21e771a90787fa
0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h98t5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pzdml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.213314 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c845e62-37a1-473c-a4d0-a354594903bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dad4298411f5a39d26fd792deac0a703f7bffd1c9bcad7e3b1ada519c271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5df609405f42c52c16cb80151d888064d27e355fbeb05504e4ae03d26a29efbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://471b75c68b3919425240e2a0f9ccbf1a3647532b63b5c9de0e2a1bced2509a09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:16Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a592151d48a555088503f7e0ed21b8c509ab878f2e37e51771af3b3aea02aba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d29a
350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d29a350f246b27918115c835a199d9fc2ac8c07fb85b93d2b1ffb15cbb7c146\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1667dd6d7d16dba44ae6c12b78c491ee9c1eba8d4641a8f1458e67c5bc0df8e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:19Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c62056a915fea7fa791ca88a10121b36afea44c72accf7d5abda3532185b9270\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T09:05:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T09:05:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9zdjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nrzgf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.220787 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7xb47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d41077f5-9f66-4be5-bb1a-e0f5b2b078e0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://128b43d64487c59512f9f47e75714e76295c583538b730ede5fa5e8834567a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T09:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7xb47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.228801 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.236571 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99873383-15b6-42ee-a65f-7917294d2e02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e48e2b71e89fc30ca9bde38bfd5f561f068b5e989709388e5f6d139eb016de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e6
40e277a0e7dc6cd1a2c443d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-58nsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zxzn8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.245042 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d7084e9b3b0127f2853b9407716a7955531658b83eaa2e42b16936e10dc4636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:14 crc kubenswrapper[4883]: I0310 09:06:14.252201 4883 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fd36c79-e84e-49aa-97b9-616563193cd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:04:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28a8b66df1dcdc75971cf31261239f30efa4e725ea2b65d0e920f7394dba2f30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8304ea14e4224675a7aa75079c8c07ecd42ee1e1945a7c714deac420883ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T09:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2lkr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T09:04:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-x7sm9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:14Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:15 crc kubenswrapper[4883]: I0310 09:06:15.078851 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:15 crc kubenswrapper[4883]: I0310 09:06:15.078987 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:15 crc kubenswrapper[4883]: I0310 09:06:15.078985 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:15 crc kubenswrapper[4883]: E0310 09:06:15.079156 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:15 crc kubenswrapper[4883]: I0310 09:06:15.079193 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:15 crc kubenswrapper[4883]: E0310 09:06:15.079214 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:15 crc kubenswrapper[4883]: E0310 09:06:15.079282 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:15 crc kubenswrapper[4883]: E0310 09:06:15.079438 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:17 crc kubenswrapper[4883]: I0310 09:06:17.079377 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:17 crc kubenswrapper[4883]: E0310 09:06:17.079500 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:17 crc kubenswrapper[4883]: I0310 09:06:17.079548 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:17 crc kubenswrapper[4883]: E0310 09:06:17.079587 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:17 crc kubenswrapper[4883]: I0310 09:06:17.079694 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:17 crc kubenswrapper[4883]: I0310 09:06:17.079718 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:17 crc kubenswrapper[4883]: E0310 09:06:17.079831 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:17 crc kubenswrapper[4883]: E0310 09:06:17.079931 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:19 crc kubenswrapper[4883]: I0310 09:06:19.079415 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:19 crc kubenswrapper[4883]: I0310 09:06:19.079469 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:19 crc kubenswrapper[4883]: E0310 09:06:19.079578 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:19 crc kubenswrapper[4883]: I0310 09:06:19.079423 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:19 crc kubenswrapper[4883]: I0310 09:06:19.079634 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:19 crc kubenswrapper[4883]: E0310 09:06:19.079741 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:19 crc kubenswrapper[4883]: E0310 09:06:19.079826 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:19 crc kubenswrapper[4883]: E0310 09:06:19.079875 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:19 crc kubenswrapper[4883]: E0310 09:06:19.159156 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.152786 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.152831 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.152842 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.152862 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.152872 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:20Z","lastTransitionTime":"2026-03-10T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:06:20 crc kubenswrapper[4883]: E0310 09:06:20.164169 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.167377 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.167414 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.167424 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.167438 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.167449 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:20Z","lastTransitionTime":"2026-03-10T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:20 crc kubenswrapper[4883]: E0310 09:06:20.176198 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.178961 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.178991 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.179000 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.179025 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.179038 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:20Z","lastTransitionTime":"2026-03-10T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 09:06:20 crc kubenswrapper[4883]: E0310 09:06:20.187365 4883 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T09:06:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ffe6628-2ca8-4f77-b1d4-26329720410f\\\",\\\"systemUUID\\\":\\\"f194c89e-85d8-4bba-8c7a-70d8bbd420b2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T09:06:20Z is after 2025-08-24T17:21:41Z" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.189891 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.189926 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.189939 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.189953 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.189965 4883 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T09:06:20Z","lastTransitionTime":"2026-03-10T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.231124 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9"] Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.231537 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.232811 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.233223 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.233406 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.234390 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.247362 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=93.247341853 podStartE2EDuration="1m33.247341853s" podCreationTimestamp="2026-03-10 09:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.247122856 +0000 UTC m=+166.502020745" watchObservedRunningTime="2026-03-10 09:06:20.247341853 +0000 UTC m=+166.502239742" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.263436 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.2634123 podStartE2EDuration="1m18.2634123s" podCreationTimestamp="2026-03-10 09:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.263253729 +0000 UTC m=+166.518151618" watchObservedRunningTime="2026-03-10 09:06:20.2634123 +0000 UTC m=+166.518310200" Mar 10 09:06:20 crc 
kubenswrapper[4883]: I0310 09:06:20.272034 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=29.271675544 podStartE2EDuration="29.271675544s" podCreationTimestamp="2026-03-10 09:05:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.271277947 +0000 UTC m=+166.526175837" watchObservedRunningTime="2026-03-10 09:06:20.271675544 +0000 UTC m=+166.526573433" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.286091 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.286131 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16a87c79-f274-4b63-9efd-b7ab322e8567-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.286163 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a87c79-f274-4b63-9efd-b7ab322e8567-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.286189 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.286203 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16a87c79-f274-4b63-9efd-b7ab322e8567-service-ca\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.298430 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=85.298412722 podStartE2EDuration="1m25.298412722s" podCreationTimestamp="2026-03-10 09:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.298271242 +0000 UTC m=+166.553169131" watchObservedRunningTime="2026-03-10 09:06:20.298412722 +0000 UTC m=+166.553310610" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.305495 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vvbjw" podStartSLOduration=138.305464267 podStartE2EDuration="2m18.305464267s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.305096387 +0000 UTC m=+166.559994276" watchObservedRunningTime="2026-03-10 09:06:20.305464267 +0000 UTC m=+166.560362156" Mar 10 
09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.314927 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-p898z" podStartSLOduration=138.314913769 podStartE2EDuration="2m18.314913769s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.31459358 +0000 UTC m=+166.569491479" watchObservedRunningTime="2026-03-10 09:06:20.314913769 +0000 UTC m=+166.569811658" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.332204 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=56.33218832 podStartE2EDuration="56.33218832s" podCreationTimestamp="2026-03-10 09:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.331774572 +0000 UTC m=+166.586672461" watchObservedRunningTime="2026-03-10 09:06:20.33218832 +0000 UTC m=+166.587086209" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.344212 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-x7sm9" podStartSLOduration=138.344192305 podStartE2EDuration="2m18.344192305s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.344073349 +0000 UTC m=+166.598971238" watchObservedRunningTime="2026-03-10 09:06:20.344192305 +0000 UTC m=+166.599090194" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.382364 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nrzgf" podStartSLOduration=138.382349045 
podStartE2EDuration="2m18.382349045s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.381867819 +0000 UTC m=+166.636765707" watchObservedRunningTime="2026-03-10 09:06:20.382349045 +0000 UTC m=+166.637246934" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387215 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a87c79-f274-4b63-9efd-b7ab322e8567-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387269 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387289 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16a87c79-f274-4b63-9efd-b7ab322e8567-service-ca\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387357 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: 
\"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387379 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16a87c79-f274-4b63-9efd-b7ab322e8567-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387584 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.387637 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/16a87c79-f274-4b63-9efd-b7ab322e8567-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.388287 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16a87c79-f274-4b63-9efd-b7ab322e8567-service-ca\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.391931 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/16a87c79-f274-4b63-9efd-b7ab322e8567-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.394723 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7xb47" podStartSLOduration=138.394713547 podStartE2EDuration="2m18.394713547s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.393961715 +0000 UTC m=+166.648859604" watchObservedRunningTime="2026-03-10 09:06:20.394713547 +0000 UTC m=+166.649611426" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.400183 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16a87c79-f274-4b63-9efd-b7ab322e8567-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-llks9\" (UID: \"16a87c79-f274-4b63-9efd-b7ab322e8567\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.422639 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podStartSLOduration=138.422617727 podStartE2EDuration="2m18.422617727s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.422539979 +0000 UTC m=+166.677437867" watchObservedRunningTime="2026-03-10 09:06:20.422617727 +0000 UTC m=+166.677515605" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.542762 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.744179 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" event={"ID":"16a87c79-f274-4b63-9efd-b7ab322e8567","Type":"ContainerStarted","Data":"1a413818aabae19edcf2afd8ad0856cc578e90ff4b9bcb2cef1d239e938c8388"} Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.744419 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" event={"ID":"16a87c79-f274-4b63-9efd-b7ab322e8567","Type":"ContainerStarted","Data":"23873ee8e0141ab19e7238d4b2496a8877c5418d90616f1cedea30304728607a"} Mar 10 09:06:20 crc kubenswrapper[4883]: I0310 09:06:20.757144 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-llks9" podStartSLOduration=138.757127639 podStartE2EDuration="2m18.757127639s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:20.756805225 +0000 UTC m=+167.011703114" watchObservedRunningTime="2026-03-10 09:06:20.757127639 +0000 UTC m=+167.012025529" Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.079220 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.079259 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.079299 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.079282 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:21 crc kubenswrapper[4883]: E0310 09:06:21.079365 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:21 crc kubenswrapper[4883]: E0310 09:06:21.079499 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:21 crc kubenswrapper[4883]: E0310 09:06:21.079611 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:21 crc kubenswrapper[4883]: E0310 09:06:21.079688 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.144628 4883 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 09:06:21 crc kubenswrapper[4883]: I0310 09:06:21.150895 4883 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 09:06:23 crc kubenswrapper[4883]: I0310 09:06:23.079492 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:23 crc kubenswrapper[4883]: I0310 09:06:23.079559 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:23 crc kubenswrapper[4883]: I0310 09:06:23.079573 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:23 crc kubenswrapper[4883]: E0310 09:06:23.079659 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:23 crc kubenswrapper[4883]: E0310 09:06:23.079884 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:23 crc kubenswrapper[4883]: I0310 09:06:23.080104 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:23 crc kubenswrapper[4883]: E0310 09:06:23.080151 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:23 crc kubenswrapper[4883]: E0310 09:06:23.080449 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:24 crc kubenswrapper[4883]: E0310 09:06:24.160617 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:25 crc kubenswrapper[4883]: I0310 09:06:25.078973 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:25 crc kubenswrapper[4883]: I0310 09:06:25.079062 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:25 crc kubenswrapper[4883]: I0310 09:06:25.079108 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:25 crc kubenswrapper[4883]: E0310 09:06:25.079139 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:25 crc kubenswrapper[4883]: I0310 09:06:25.079206 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:25 crc kubenswrapper[4883]: E0310 09:06:25.079333 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:25 crc kubenswrapper[4883]: E0310 09:06:25.079437 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:25 crc kubenswrapper[4883]: E0310 09:06:25.079775 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:25 crc kubenswrapper[4883]: I0310 09:06:25.079927 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:06:25 crc kubenswrapper[4883]: E0310 09:06:25.080106 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:06:27 crc kubenswrapper[4883]: I0310 09:06:27.079325 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:27 crc kubenswrapper[4883]: I0310 09:06:27.079407 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:27 crc kubenswrapper[4883]: E0310 09:06:27.079444 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:27 crc kubenswrapper[4883]: I0310 09:06:27.079466 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:27 crc kubenswrapper[4883]: I0310 09:06:27.079467 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:27 crc kubenswrapper[4883]: E0310 09:06:27.079557 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:27 crc kubenswrapper[4883]: E0310 09:06:27.079649 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:27 crc kubenswrapper[4883]: E0310 09:06:27.079787 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:29 crc kubenswrapper[4883]: I0310 09:06:29.078849 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:29 crc kubenswrapper[4883]: I0310 09:06:29.078876 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:29 crc kubenswrapper[4883]: I0310 09:06:29.078905 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:29 crc kubenswrapper[4883]: E0310 09:06:29.078968 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:29 crc kubenswrapper[4883]: I0310 09:06:29.078988 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:29 crc kubenswrapper[4883]: E0310 09:06:29.079090 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:29 crc kubenswrapper[4883]: E0310 09:06:29.079232 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:29 crc kubenswrapper[4883]: E0310 09:06:29.079338 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:29 crc kubenswrapper[4883]: E0310 09:06:29.162080 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:31 crc kubenswrapper[4883]: I0310 09:06:31.079645 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:31 crc kubenswrapper[4883]: I0310 09:06:31.079739 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:31 crc kubenswrapper[4883]: E0310 09:06:31.079798 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:31 crc kubenswrapper[4883]: I0310 09:06:31.079831 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:31 crc kubenswrapper[4883]: E0310 09:06:31.080008 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:31 crc kubenswrapper[4883]: E0310 09:06:31.080126 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:31 crc kubenswrapper[4883]: I0310 09:06:31.080377 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:31 crc kubenswrapper[4883]: E0310 09:06:31.080448 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:33 crc kubenswrapper[4883]: I0310 09:06:33.079355 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:33 crc kubenswrapper[4883]: I0310 09:06:33.079425 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:33 crc kubenswrapper[4883]: I0310 09:06:33.079506 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:33 crc kubenswrapper[4883]: E0310 09:06:33.079585 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:33 crc kubenswrapper[4883]: I0310 09:06:33.079453 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:33 crc kubenswrapper[4883]: E0310 09:06:33.079669 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:33 crc kubenswrapper[4883]: E0310 09:06:33.079751 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:33 crc kubenswrapper[4883]: E0310 09:06:33.079826 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:34 crc kubenswrapper[4883]: E0310 09:06:34.162964 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:35 crc kubenswrapper[4883]: I0310 09:06:35.079351 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:35 crc kubenswrapper[4883]: I0310 09:06:35.079400 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:35 crc kubenswrapper[4883]: I0310 09:06:35.079402 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:35 crc kubenswrapper[4883]: E0310 09:06:35.079557 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:35 crc kubenswrapper[4883]: I0310 09:06:35.079581 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:35 crc kubenswrapper[4883]: E0310 09:06:35.079729 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:35 crc kubenswrapper[4883]: E0310 09:06:35.079806 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:35 crc kubenswrapper[4883]: E0310 09:06:35.079868 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:36 crc kubenswrapper[4883]: I0310 09:06:36.080607 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:06:36 crc kubenswrapper[4883]: E0310 09:06:36.080899 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pzdml_openshift-ovn-kubernetes(fc928c48-1df8-4c31-986e-eba2aa7a1c0b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" Mar 10 09:06:37 crc kubenswrapper[4883]: I0310 09:06:37.079867 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:37 crc kubenswrapper[4883]: I0310 09:06:37.079935 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:37 crc kubenswrapper[4883]: I0310 09:06:37.079895 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:37 crc kubenswrapper[4883]: I0310 09:06:37.079867 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:37 crc kubenswrapper[4883]: E0310 09:06:37.080054 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:37 crc kubenswrapper[4883]: E0310 09:06:37.080230 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:37 crc kubenswrapper[4883]: E0310 09:06:37.080270 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:37 crc kubenswrapper[4883]: E0310 09:06:37.080586 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:39 crc kubenswrapper[4883]: I0310 09:06:39.078993 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:39 crc kubenswrapper[4883]: I0310 09:06:39.079084 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:39 crc kubenswrapper[4883]: I0310 09:06:39.078988 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:39 crc kubenswrapper[4883]: I0310 09:06:39.079183 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:39 crc kubenswrapper[4883]: E0310 09:06:39.079178 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:39 crc kubenswrapper[4883]: E0310 09:06:39.079232 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:39 crc kubenswrapper[4883]: E0310 09:06:39.079277 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:39 crc kubenswrapper[4883]: E0310 09:06:39.079376 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:39 crc kubenswrapper[4883]: E0310 09:06:39.164640 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:41 crc kubenswrapper[4883]: I0310 09:06:41.079177 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:41 crc kubenswrapper[4883]: E0310 09:06:41.080043 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:41 crc kubenswrapper[4883]: I0310 09:06:41.079317 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:41 crc kubenswrapper[4883]: E0310 09:06:41.080229 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:41 crc kubenswrapper[4883]: I0310 09:06:41.079289 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:41 crc kubenswrapper[4883]: E0310 09:06:41.080425 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:41 crc kubenswrapper[4883]: I0310 09:06:41.079563 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:41 crc kubenswrapper[4883]: E0310 09:06:41.080628 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:43 crc kubenswrapper[4883]: I0310 09:06:43.079783 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:43 crc kubenswrapper[4883]: I0310 09:06:43.079976 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:43 crc kubenswrapper[4883]: I0310 09:06:43.080027 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:43 crc kubenswrapper[4883]: I0310 09:06:43.080190 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:43 crc kubenswrapper[4883]: E0310 09:06:43.080200 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:43 crc kubenswrapper[4883]: E0310 09:06:43.080324 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:43 crc kubenswrapper[4883]: E0310 09:06:43.080501 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:43 crc kubenswrapper[4883]: E0310 09:06:43.080610 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:44 crc kubenswrapper[4883]: E0310 09:06:44.165287 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:45 crc kubenswrapper[4883]: I0310 09:06:45.079795 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:45 crc kubenswrapper[4883]: E0310 09:06:45.079960 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:45 crc kubenswrapper[4883]: I0310 09:06:45.080099 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:45 crc kubenswrapper[4883]: I0310 09:06:45.080151 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:45 crc kubenswrapper[4883]: I0310 09:06:45.080324 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:45 crc kubenswrapper[4883]: E0310 09:06:45.080355 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:45 crc kubenswrapper[4883]: E0310 09:06:45.080533 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:45 crc kubenswrapper[4883]: E0310 09:06:45.080651 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.830003 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/1.log" Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.831021 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/0.log" Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.831085 4883 generic.go:334] "Generic (PLEG): container finished" podID="8e883c29-520e-4b1f-b49c-3df10450d467" containerID="498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0" exitCode=1 Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.831124 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerDied","Data":"498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0"} Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.831173 4883 scope.go:117] "RemoveContainer" containerID="8e47ade79120372f314a1b59dbf78e87638709229ddd3fd1038213673fc537e3" Mar 10 09:06:46 crc kubenswrapper[4883]: I0310 09:06:46.831785 4883 scope.go:117] "RemoveContainer" containerID="498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0" Mar 10 09:06:46 crc kubenswrapper[4883]: E0310 09:06:46.832031 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467)\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:06:47 crc kubenswrapper[4883]: I0310 09:06:47.079207 4883 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:47 crc kubenswrapper[4883]: I0310 09:06:47.079760 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:47 crc kubenswrapper[4883]: E0310 09:06:47.079956 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:47 crc kubenswrapper[4883]: I0310 09:06:47.079981 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:47 crc kubenswrapper[4883]: E0310 09:06:47.080175 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:47 crc kubenswrapper[4883]: I0310 09:06:47.080008 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:47 crc kubenswrapper[4883]: E0310 09:06:47.080370 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:47 crc kubenswrapper[4883]: E0310 09:06:47.080175 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:47 crc kubenswrapper[4883]: I0310 09:06:47.836164 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/1.log" Mar 10 09:06:49 crc kubenswrapper[4883]: I0310 09:06:49.079865 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:49 crc kubenswrapper[4883]: I0310 09:06:49.079973 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:49 crc kubenswrapper[4883]: I0310 09:06:49.079985 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:49 crc kubenswrapper[4883]: I0310 09:06:49.080006 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:49 crc kubenswrapper[4883]: E0310 09:06:49.080134 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:49 crc kubenswrapper[4883]: E0310 09:06:49.080270 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:49 crc kubenswrapper[4883]: E0310 09:06:49.080542 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:49 crc kubenswrapper[4883]: E0310 09:06:49.080618 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:49 crc kubenswrapper[4883]: E0310 09:06:49.166640 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.079247 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.079302 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.079341 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.079253 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:51 crc kubenswrapper[4883]: E0310 09:06:51.079413 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:51 crc kubenswrapper[4883]: E0310 09:06:51.079569 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:51 crc kubenswrapper[4883]: E0310 09:06:51.079732 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:51 crc kubenswrapper[4883]: E0310 09:06:51.080027 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.080252 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.752272 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gmq5n"] Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.852427 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/3.log" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.855809 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:51 crc kubenswrapper[4883]: E0310 09:06:51.855984 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.856224 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerStarted","Data":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} Mar 10 09:06:51 crc kubenswrapper[4883]: I0310 09:06:51.856589 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:06:53 crc kubenswrapper[4883]: I0310 09:06:53.079436 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:53 crc kubenswrapper[4883]: I0310 09:06:53.079506 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:53 crc kubenswrapper[4883]: I0310 09:06:53.079638 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:53 crc kubenswrapper[4883]: E0310 09:06:53.079640 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:53 crc kubenswrapper[4883]: I0310 09:06:53.079739 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:53 crc kubenswrapper[4883]: E0310 09:06:53.079934 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:53 crc kubenswrapper[4883]: E0310 09:06:53.079993 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:53 crc kubenswrapper[4883]: E0310 09:06:53.080045 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:54 crc kubenswrapper[4883]: E0310 09:06:54.167429 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078520 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078596 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078650 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.078750 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:08:57.078709743 +0000 UTC m=+323.333607642 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.078788 4883 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.078818 4883 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.078878 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 09:08:57.078853193 +0000 UTC m=+323.333751082 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.078901 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 09:08:57.078891394 +0000 UTC m=+323.333789283 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078944 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078988 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.079017 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.078988 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.079094 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.079201 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.079293 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.079355 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.179315 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.179373 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:55 crc kubenswrapper[4883]: I0310 09:06:55.179395 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179517 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179523 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179559 4883 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179624 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs podName:bd6597a3-f861-4126-933e-d6134c8bd4b5 nodeName:}" failed. No retries permitted until 2026-03-10 09:08:57.179609232 +0000 UTC m=+323.434507120 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs") pod "network-metrics-daemon-gmq5n" (UID: "bd6597a3-f861-4126-933e-d6134c8bd4b5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179569 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179681 4883 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179535 4883 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179736 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 09:08:57.179721132 +0000 UTC m=+323.434619022 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179741 4883 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:06:55 crc kubenswrapper[4883]: E0310 09:06:55.179773 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 09:08:57.179764654 +0000 UTC m=+323.434662533 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 09:06:57 crc kubenswrapper[4883]: I0310 09:06:57.079582 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:57 crc kubenswrapper[4883]: I0310 09:06:57.079667 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:57 crc kubenswrapper[4883]: I0310 09:06:57.079714 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:57 crc kubenswrapper[4883]: E0310 09:06:57.079758 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:57 crc kubenswrapper[4883]: I0310 09:06:57.079830 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:57 crc kubenswrapper[4883]: E0310 09:06:57.079939 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:57 crc kubenswrapper[4883]: E0310 09:06:57.080076 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:57 crc kubenswrapper[4883]: E0310 09:06:57.080219 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:58 crc kubenswrapper[4883]: I0310 09:06:58.079774 4883 scope.go:117] "RemoveContainer" containerID="498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0" Mar 10 09:06:58 crc kubenswrapper[4883]: I0310 09:06:58.121140 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podStartSLOduration=176.121123091 podStartE2EDuration="2m56.121123091s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:06:51.898222421 +0000 UTC m=+198.153120300" watchObservedRunningTime="2026-03-10 09:06:58.121123091 +0000 UTC m=+204.376020981" Mar 10 09:06:58 crc kubenswrapper[4883]: I0310 09:06:58.878468 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/1.log" Mar 10 09:06:58 crc kubenswrapper[4883]: I0310 09:06:58.878551 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerStarted","Data":"5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d"} Mar 10 09:06:59 crc kubenswrapper[4883]: I0310 09:06:59.079797 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:06:59 crc kubenswrapper[4883]: I0310 09:06:59.079797 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:06:59 crc kubenswrapper[4883]: E0310 09:06:59.080791 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:06:59 crc kubenswrapper[4883]: I0310 09:06:59.079873 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:06:59 crc kubenswrapper[4883]: E0310 09:06:59.080954 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:06:59 crc kubenswrapper[4883]: I0310 09:06:59.079840 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:06:59 crc kubenswrapper[4883]: E0310 09:06:59.081056 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:06:59 crc kubenswrapper[4883]: E0310 09:06:59.081108 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:06:59 crc kubenswrapper[4883]: E0310 09:06:59.169218 4883 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:07:01 crc kubenswrapper[4883]: I0310 09:07:01.079495 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:07:01 crc kubenswrapper[4883]: I0310 09:07:01.079572 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:07:01 crc kubenswrapper[4883]: I0310 09:07:01.079502 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:07:01 crc kubenswrapper[4883]: E0310 09:07:01.079651 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:07:01 crc kubenswrapper[4883]: I0310 09:07:01.079763 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:07:01 crc kubenswrapper[4883]: E0310 09:07:01.079945 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:07:01 crc kubenswrapper[4883]: E0310 09:07:01.080096 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:07:01 crc kubenswrapper[4883]: E0310 09:07:01.080265 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:07:03 crc kubenswrapper[4883]: I0310 09:07:03.079141 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:07:03 crc kubenswrapper[4883]: I0310 09:07:03.079197 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:07:03 crc kubenswrapper[4883]: I0310 09:07:03.079144 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:07:03 crc kubenswrapper[4883]: E0310 09:07:03.079293 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gmq5n" podUID="bd6597a3-f861-4126-933e-d6134c8bd4b5" Mar 10 09:07:03 crc kubenswrapper[4883]: E0310 09:07:03.079348 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 09:07:03 crc kubenswrapper[4883]: I0310 09:07:03.079257 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:07:03 crc kubenswrapper[4883]: E0310 09:07:03.079522 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 09:07:03 crc kubenswrapper[4883]: E0310 09:07:03.079519 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.079447 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.079459 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.079600 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.080086 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.082401 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.082401 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.082726 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.083060 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.083242 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 09:07:05 crc kubenswrapper[4883]: I0310 09:07:05.083463 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.395345 4883 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.429684 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h5tmh"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.430407 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.430595 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.430857 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.431203 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.431318 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.431425 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7clc9"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.431779 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437176 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437270 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437634 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437720 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437321 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 
09:07:11.437534 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437540 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437894 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.437899 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.438127 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.438277 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.438417 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.439025 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.439349 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-42rrg"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.439823 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.440156 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.440469 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.440914 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-29pxk"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.441276 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.441626 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.441641 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.441796 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.442048 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.442853 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.443291 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.443704 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.444024 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.444249 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.444659 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.444809 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.444849 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nbvf4"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.445008 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.445196 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.450439 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.450878 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.452692 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-69msk"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.453219 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-69msk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.454074 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.454621 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.454788 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.454971 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455001 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455006 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 
09:07:11.455138 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455202 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455215 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455227 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455339 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455617 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455743 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455856 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.455900 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459722 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459751 4883 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459797 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459810 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459754 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.459926 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.460094 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.465731 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.466007 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.466159 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.466547 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.466925 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 09:07:11 crc 
kubenswrapper[4883]: I0310 09:07:11.467058 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467180 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467261 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467405 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467450 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467643 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467789 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467801 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.467957 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.468123 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.468494 4883 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.468509 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.468660 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.469178 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.469383 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.486964 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.487282 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.487419 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.487581 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.485796 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.485945 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.487856 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.485994 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.486047 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.488006 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.486104 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.486168 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.488128 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.486904 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.488806 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.489164 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.489631 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ppkhj"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.490376 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.490392 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.491589 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.491621 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.501246 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.501357 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.501516 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.502723 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.503610 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.503865 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.503999 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504018 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504113 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504412 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504531 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504645 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4sznb"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504678 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504711 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.504760 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.505369 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.505386 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.508561 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.509037 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.509320 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.509410 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.511585 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.511666 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.512189 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.514030 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.516570 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-42rrg"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.516605 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.517088 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.523965 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.524112 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.527065 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.527173 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.527757 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7clc9"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.527839 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.528851 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.530462 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.530559 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.533135 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.533316 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.533362 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9vv9k"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.533444 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.534141 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9vv9k"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.536158 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.541983 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.543928 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.548331 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.550729 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.551730 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h5tmh"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.551875 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.553900 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5fqgx"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.555931 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.556001 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.556292 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.556410 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.556878 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.557579 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-29pxk"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.558199 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.558949 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.559192 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.559953 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.560227 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.561533 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.563904 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.564016 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.564222 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.564592 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.564642 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.564841 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.565267 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.565742 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dh2nm"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.566748 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.567049 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.567502 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.568821 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.570295 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.570484 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.572465 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.572825 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552226-jp7d9"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.572924 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.573302 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.573430 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-jp7d9"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.573703 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.574001 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-69msk"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.574924 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4sznb"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.576005 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ppkhj"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.577194 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nbvf4"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.579137 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.580408 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.581489 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.582693 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.583853 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kzsbn"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.585039 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.586077 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.587202 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.588146 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.590185 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.592215 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.598166 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.600600 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.604124 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.606154 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.607346 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.608948 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.609891 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-serving-cert\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.609930 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04125307-b213-4579-8042-92284900796b-metrics-tls\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.609955 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-serving-cert\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.609980 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-auth-proxy-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610013 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fw5p\" (UniqueName: \"kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610036 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610062 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610082 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610108 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610196 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-trusted-ca\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610254 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610323 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-images\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610352 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcx47\" (UniqueName: \"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-kube-api-access-lcx47\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610375 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610397 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-dir\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610419 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04118e18-43d2-4aed-9812-aba776c0bf61-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610438 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610461 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610501 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-client\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610523 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610546 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610633 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmprt\" (UniqueName: \"kubernetes.io/projected/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-kube-api-access-vmprt\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610707 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xknbc\" (UniqueName: \"kubernetes.io/projected/93468e41-3e48-469f-90a9-7e05e45fe141-kube-api-access-xknbc\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610747 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-client\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610790 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610825 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bed2a913-4f7d-4a64-aed8-a510280c9b6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610848 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610877 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8pnh\" (UniqueName: \"kubernetes.io/projected/04118e18-43d2-4aed-9812-aba776c0bf61-kube-api-access-d8pnh\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610896 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610943 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610969 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.610994 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611025 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-config\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611046 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"]
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611047 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8"
Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611081 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-config\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9"
Mar 10 09:07:11 crc kubenswrapper[4883]:
I0310 09:07:11.611162 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg6gc\" (UniqueName: \"kubernetes.io/projected/040462f9-c464-41cc-8843-cac46b3da8bf-kube-api-access-lg6gc\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611230 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j54tj\" (UniqueName: \"kubernetes.io/projected/09a0b780-3bf5-4607-9907-33e16ae4f098-kube-api-access-j54tj\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611258 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-encryption-config\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611295 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611318 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611337 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04118e18-43d2-4aed-9812-aba776c0bf61-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611360 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611387 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lm8b\" (UniqueName: \"kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611413 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-service-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611432 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-config\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611449 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611470 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611527 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611578 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-config\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611602 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611626 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwk4d\" (UniqueName: \"kubernetes.io/projected/ca36a0b9-d7c9-4195-803b-53d41ac683d9-kube-api-access-bwk4d\") pod \"downloads-7954f5f757-69msk\" (UID: \"ca36a0b9-d7c9-4195-803b-53d41ac683d9\") " pod="openshift-console/downloads-7954f5f757-69msk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611651 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/491a1079-cbfa-470e-b91b-84e323ae0c6d-metrics-tls\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611692 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611715 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-policies\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611750 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qbqb\" (UniqueName: \"kubernetes.io/projected/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-kube-api-access-2qbqb\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611788 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cczd7\" (UniqueName: \"kubernetes.io/projected/d6acff1e-cd79-44a7-bb48-1a79857b2a97-kube-api-access-cczd7\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611808 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611826 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.611952 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04125307-b213-4579-8042-92284900796b-trusted-ca\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612018 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612032 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bed2a913-4f7d-4a64-aed8-a510280c9b6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612053 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612074 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mhr2\" (UniqueName: 
\"kubernetes.io/projected/491a1079-cbfa-470e-b91b-84e323ae0c6d-kube-api-access-6mhr2\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612096 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612071 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612120 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612192 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-serving-cert\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612245 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612268 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btm62\" (UniqueName: \"kubernetes.io/projected/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-kube-api-access-btm62\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.612291 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9kh\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-kube-api-access-7b9kh\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613089 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040462f9-c464-41cc-8843-cac46b3da8bf-serving-cert\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613144 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09a0b780-3bf5-4607-9907-33e16ae4f098-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613174 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d6acff1e-cd79-44a7-bb48-1a79857b2a97-machine-approver-tls\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613181 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613197 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qmdc\" (UniqueName: \"kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613257 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613340 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m7986\" (UID: 
\"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.613989 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-jp7d9"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.614971 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.616110 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.617094 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5fqgx"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.618058 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.618973 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.620302 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.621391 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.622978 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kzsbn"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.624219 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dh2nm"] Mar 10 09:07:11 
crc kubenswrapper[4883]: I0310 09:07:11.625101 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-bzfz7"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.625928 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.626013 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x6pxw"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.627205 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.627853 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x6pxw"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.632776 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.644157 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5nj7x"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.645031 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.652869 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.653334 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5nj7x"] Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.673607 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.693259 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.712085 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714488 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-config\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714527 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg6gc\" (UniqueName: \"kubernetes.io/projected/040462f9-c464-41cc-8843-cac46b3da8bf-kube-api-access-lg6gc\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714559 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j54tj\" (UniqueName: 
\"kubernetes.io/projected/09a0b780-3bf5-4607-9907-33e16ae4f098-kube-api-access-j54tj\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714580 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-encryption-config\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714601 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714626 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714649 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04118e18-43d2-4aed-9812-aba776c0bf61-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714674 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714695 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lm8b\" (UniqueName: \"kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714711 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714731 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-service-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714750 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-config\") pod \"console-operator-58897d9998-42rrg\" 
(UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714773 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714792 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714813 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-config\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714832 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714851 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwk4d\" 
(UniqueName: \"kubernetes.io/projected/ca36a0b9-d7c9-4195-803b-53d41ac683d9-kube-api-access-bwk4d\") pod \"downloads-7954f5f757-69msk\" (UID: \"ca36a0b9-d7c9-4195-803b-53d41ac683d9\") " pod="openshift-console/downloads-7954f5f757-69msk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714869 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/491a1079-cbfa-470e-b91b-84e323ae0c6d-metrics-tls\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714891 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714906 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-policies\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714923 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qbqb\" (UniqueName: \"kubernetes.io/projected/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-kube-api-access-2qbqb\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714952 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cczd7\" (UniqueName: \"kubernetes.io/projected/d6acff1e-cd79-44a7-bb48-1a79857b2a97-kube-api-access-cczd7\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714970 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.714991 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715008 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04125307-b213-4579-8042-92284900796b-trusted-ca\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715037 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bed2a913-4f7d-4a64-aed8-a510280c9b6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 
09:07:11.715447 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-config\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715759 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715801 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715821 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mhr2\" (UniqueName: \"kubernetes.io/projected/491a1079-cbfa-470e-b91b-84e323ae0c6d-kube-api-access-6mhr2\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715845 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715871 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715894 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-serving-cert\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715916 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btm62\" (UniqueName: \"kubernetes.io/projected/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-kube-api-access-btm62\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715936 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b9kh\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-kube-api-access-7b9kh\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715969 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/040462f9-c464-41cc-8843-cac46b3da8bf-serving-cert\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.715989 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09a0b780-3bf5-4607-9907-33e16ae4f098-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716012 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d6acff1e-cd79-44a7-bb48-1a79857b2a97-machine-approver-tls\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716032 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qmdc\" (UniqueName: \"kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716049 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-policies\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 
09:07:11.716112 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716256 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716055 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716371 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716411 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-auth-proxy-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: 
\"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716443 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-serving-cert\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716468 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04125307-b213-4579-8042-92284900796b-metrics-tls\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716523 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-serving-cert\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716573 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fw5p\" (UniqueName: \"kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716604 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716626 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716652 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716674 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716704 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-trusted-ca\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716734 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716759 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-images\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716787 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcx47\" (UniqueName: \"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-kube-api-access-lcx47\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716808 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716828 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716847 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-dir\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716866 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04118e18-43d2-4aed-9812-aba776c0bf61-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716887 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716906 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716893 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716920 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716999 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-client\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717030 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717059 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717084 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vmprt\" (UniqueName: \"kubernetes.io/projected/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-kube-api-access-vmprt\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717122 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xknbc\" (UniqueName: \"kubernetes.io/projected/93468e41-3e48-469f-90a9-7e05e45fe141-kube-api-access-xknbc\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717147 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-client\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717182 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717204 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bed2a913-4f7d-4a64-aed8-a510280c9b6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717231 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717255 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8pnh\" (UniqueName: \"kubernetes.io/projected/04118e18-43d2-4aed-9812-aba776c0bf61-kube-api-access-d8pnh\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717280 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717304 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717323 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-service-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717328 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717390 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717398 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717428 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717452 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-config\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717516 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-config\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.717883 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.718212 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.718568 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bed2a913-4f7d-4a64-aed8-a510280c9b6b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.719777 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.720873 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/040462f9-c464-41cc-8843-cac46b3da8bf-trusted-ca\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.716466 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.721367 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6acff1e-cd79-44a7-bb48-1a79857b2a97-auth-proxy-config\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.721580 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.722340 4883 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-images\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.722767 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-etcd-client\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.723099 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.723406 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-serving-cert\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.723432 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-audit-dir\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.723501 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.724738 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-encryption-config\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725003 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04118e18-43d2-4aed-9812-aba776c0bf61-config\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725126 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09a0b780-3bf5-4607-9907-33e16ae4f098-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725299 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 
09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725361 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/040462f9-c464-41cc-8843-cac46b3da8bf-serving-cert\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725645 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.725803 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726082 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726111 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726141 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04118e18-43d2-4aed-9812-aba776c0bf61-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726188 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726419 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/491a1079-cbfa-470e-b91b-84e323ae0c6d-metrics-tls\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726541 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726530 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert\") pod 
\"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.726879 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.727305 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.727324 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.727458 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.727690 4883 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.727904 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.728229 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bed2a913-4f7d-4a64-aed8-a510280c9b6b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.728420 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-serving-cert\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.728645 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d6acff1e-cd79-44a7-bb48-1a79857b2a97-machine-approver-tls\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" Mar 10 
09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.728946 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.732678 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.740916 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-client\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.752533 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.773392 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.784346 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93468e41-3e48-469f-90a9-7e05e45fe141-serving-cert\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.792663 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.796185 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-etcd-ca\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.812799 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.832138 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.838156 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93468e41-3e48-469f-90a9-7e05e45fe141-config\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.852198 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.872857 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.892251 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.912571 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.932436 4883 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.952517 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.965467 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04125307-b213-4579-8042-92284900796b-metrics-tls\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.972922 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 09:07:11 crc kubenswrapper[4883]: I0310 09:07:11.998985 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.006994 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04125307-b213-4579-8042-92284900796b-trusted-ca\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.012499 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.032620 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.072874 4883 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.092181 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.112323 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.132588 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.152940 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.158235 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.172873 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.192763 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.197277 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-config\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: 
\"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.212513 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.252649 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.272142 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.292417 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.313156 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.332843 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.352274 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.372440 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.392913 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.412116 4883 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.433266 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.452756 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.472951 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.492841 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.512326 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.532495 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.553150 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.571648 4883 request.go:700] Waited for 1.015040306s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.572789 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.592166 
4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.613079 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.632355 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.652628 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.672985 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.692709 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.712927 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.733241 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.752862 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.772234 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.792576 4883 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.812687 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.832107 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.852689 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.872387 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.893075 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.913046 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.932076 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.952125 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.972203 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 09:07:12 crc kubenswrapper[4883]: I0310 09:07:12.992704 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" 
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.013016 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.032755 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.052342 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.072189 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.093018 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.112071 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.133010 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.157700 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.172116 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.192014 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.213145 4883 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.233116 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.252503 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.272452 4883 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.292937 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.312425 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.332422 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.353049 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.372351 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.391872 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.412230 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 09:07:13 crc kubenswrapper[4883]: 
I0310 09:07:13.432409 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.452812 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.472573 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.492824 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.512573 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.545647 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg6gc\" (UniqueName: \"kubernetes.io/projected/040462f9-c464-41cc-8843-cac46b3da8bf-kube-api-access-lg6gc\") pod \"console-operator-58897d9998-42rrg\" (UID: \"040462f9-c464-41cc-8843-cac46b3da8bf\") " pod="openshift-console-operator/console-operator-58897d9998-42rrg"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.566229 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54tj\" (UniqueName: \"kubernetes.io/projected/09a0b780-3bf5-4607-9907-33e16ae4f098-kube-api-access-j54tj\") pod \"cluster-samples-operator-665b6dd947-stgnk\" (UID: \"09a0b780-3bf5-4607-9907-33e16ae4f098\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.571737 4883 request.go:700] Waited for 1.856552989s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.584283 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lm8b\" (UniqueName: \"kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b\") pod \"oauth-openshift-558db77b4-m7986\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.601616 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-42rrg"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.604694 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cczd7\" (UniqueName: \"kubernetes.io/projected/d6acff1e-cd79-44a7-bb48-1a79857b2a97-kube-api-access-cczd7\") pod \"machine-approver-56656f9798-4bcv8\" (UID: \"d6acff1e-cd79-44a7-bb48-1a79857b2a97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.624615 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qbqb\" (UniqueName: \"kubernetes.io/projected/4210a360-cb3e-4fa8-8fd1-98217c9b00f2-kube-api-access-2qbqb\") pod \"apiserver-7bbb656c7d-pfgjr\" (UID: \"4210a360-cb3e-4fa8-8fd1-98217c9b00f2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.644215 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.645098 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.656309 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.663552 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwk4d\" (UniqueName: \"kubernetes.io/projected/ca36a0b9-d7c9-4195-803b-53d41ac683d9-kube-api-access-bwk4d\") pod \"downloads-7954f5f757-69msk\" (UID: \"ca36a0b9-d7c9-4195-803b-53d41ac683d9\") " pod="openshift-console/downloads-7954f5f757-69msk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.669445 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk"
Mar 10 09:07:13 crc kubenswrapper[4883]: W0310 09:07:13.669625 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6acff1e_cd79_44a7_bb48_1a79857b2a97.slice/crio-c70a33bb3187dfa992af916aeaf8ced911bbee6b9fdd7a934eaa3610f16c201d WatchSource:0}: Error finding container c70a33bb3187dfa992af916aeaf8ced911bbee6b9fdd7a934eaa3610f16c201d: Status 404 returned error can't find the container with id c70a33bb3187dfa992af916aeaf8ced911bbee6b9fdd7a934eaa3610f16c201d
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.681759 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-69msk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.687813 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.706834 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mhr2\" (UniqueName: \"kubernetes.io/projected/491a1079-cbfa-470e-b91b-84e323ae0c6d-kube-api-access-6mhr2\") pod \"dns-operator-744455d44c-ppkhj\" (UID: \"491a1079-cbfa-470e-b91b-84e323ae0c6d\") " pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.727763 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btm62\" (UniqueName: \"kubernetes.io/projected/3de74a75-4aa1-46dd-ae5b-5c82b91811e5-kube-api-access-btm62\") pod \"machine-api-operator-5694c8668f-7clc9\" (UID: \"3de74a75-4aa1-46dd-ae5b-5c82b91811e5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.744449 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b9kh\" (UniqueName: \"kubernetes.io/projected/04125307-b213-4579-8042-92284900796b-kube-api-access-7b9kh\") pod \"ingress-operator-5b745b69d9-76h4k\" (UID: \"04125307-b213-4579-8042-92284900796b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.766193 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcx47\" (UniqueName: \"kubernetes.io/projected/bed2a913-4f7d-4a64-aed8-a510280c9b6b-kube-api-access-lcx47\") pod \"cluster-image-registry-operator-dc59b4c8b-v75tm\" (UID: \"bed2a913-4f7d-4a64-aed8-a510280c9b6b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.778945 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-42rrg"]
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.786678 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmprt\" (UniqueName: \"kubernetes.io/projected/04a7ee07-f81d-4e5a-aeea-b399aa39a31c-kube-api-access-vmprt\") pod \"openshift-config-operator-7777fb866f-d7b5j\" (UID: \"04a7ee07-f81d-4e5a-aeea-b399aa39a31c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"
Mar 10 09:07:13 crc kubenswrapper[4883]: W0310 09:07:13.789662 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod040462f9_c464_41cc_8843_cac46b3da8bf.slice/crio-d0223f42f5794235bdc257e8a1d33d6a24f61139c5258c9520410fdbf1e6dcfa WatchSource:0}: Error finding container d0223f42f5794235bdc257e8a1d33d6a24f61139c5258c9520410fdbf1e6dcfa: Status 404 returned error can't find the container with id d0223f42f5794235bdc257e8a1d33d6a24f61139c5258c9520410fdbf1e6dcfa
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.801113 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr"]
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.802835 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xknbc\" (UniqueName: \"kubernetes.io/projected/93468e41-3e48-469f-90a9-7e05e45fe141-kube-api-access-xknbc\") pod \"etcd-operator-b45778765-4sznb\" (UID: \"93468e41-3e48-469f-90a9-7e05e45fe141\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:13 crc kubenswrapper[4883]: W0310 09:07:13.805694 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4210a360_cb3e_4fa8_8fd1_98217c9b00f2.slice/crio-de4450a201dc516a6ff019c78a5584d2658c919f722b342324e54eb5dc7c53aa WatchSource:0}: Error finding container de4450a201dc516a6ff019c78a5584d2658c919f722b342324e54eb5dc7c53aa: Status 404 returned error can't find the container with id de4450a201dc516a6ff019c78a5584d2658c919f722b342324e54eb5dc7c53aa
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.824833 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qmdc\" (UniqueName: \"kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc\") pod \"controller-manager-879f6c89f-pzn5l\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.828946 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-69msk"]
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.846968 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fw5p\" (UniqueName: \"kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p\") pod \"console-f9d7485db-nbvf4\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") " pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.860772 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.864004 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8pnh\" (UniqueName: \"kubernetes.io/projected/04118e18-43d2-4aed-9812-aba776c0bf61-kube-api-access-d8pnh\") pod \"openshift-apiserver-operator-796bbdcf4f-fjdx9\" (UID: \"04118e18-43d2-4aed-9812-aba776c0bf61\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.865899 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7986"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.878099 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.885261 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sfsrd\" (UID: \"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.890534 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.926747 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" event={"ID":"4210a360-cb3e-4fa8-8fd1-98217c9b00f2","Type":"ContainerStarted","Data":"de4450a201dc516a6ff019c78a5584d2658c919f722b342324e54eb5dc7c53aa"}
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.928402 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-69msk" event={"ID":"ca36a0b9-d7c9-4195-803b-53d41ac683d9","Type":"ContainerStarted","Data":"de329dc36faddd60c64b1dedfa63d8ddca8d095dee912c9e5a31c84a00e1c11c"}
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.928440 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-69msk" event={"ID":"ca36a0b9-d7c9-4195-803b-53d41ac683d9","Type":"ContainerStarted","Data":"22f8d183eb153b2ddb33201c0ba167d5aa42fd502fa549946bc80a934e74c703"}
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.929453 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-69msk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.934115 4883 patch_prober.go:28] interesting pod/downloads-7954f5f757-69msk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.934142 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-69msk" podUID="ca36a0b9-d7c9-4195-803b-53d41ac683d9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.938819 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" event={"ID":"d6acff1e-cd79-44a7-bb48-1a79857b2a97","Type":"ContainerStarted","Data":"059bd85844f9b25014ac9c4e79a79d2232e103f2f10f9db59e7c5a5f37f09c50"}
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.938848 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" event={"ID":"d6acff1e-cd79-44a7-bb48-1a79857b2a97","Type":"ContainerStarted","Data":"c70a33bb3187dfa992af916aeaf8ced911bbee6b9fdd7a934eaa3610f16c201d"}
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942610 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942668 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942700 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942717 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-image-import-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942765 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942781 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp2gg\" (UniqueName: \"kubernetes.io/projected/7ba6ab17-ada9-4712-bc66-09172d648791-kube-api-access-gp2gg\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942798 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9ea088-9f19-4839-bfe4-ce54842b04c2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942833 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d9ea088-9f19-4839-bfe4-ce54842b04c2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942861 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942897 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942914 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-serving-cert\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942931 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-audit-dir\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942947 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.942981 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-audit\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943000 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba6ab17-ada9-4712-bc66-09172d648791-serving-cert\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943019 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943034 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943081 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed479632-f556-407c-a8a9-b40379bbf549-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943110 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943147 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9wjh\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943166 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-node-pullsecrets\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943190 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943231 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed479632-f556-407c-a8a9-b40379bbf549-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943250 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed479632-f556-407c-a8a9-b40379bbf549-config\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943269 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-serving-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943325 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943345 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943382 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-service-ca-bundle\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943405 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-encryption-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943422 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-config\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943457 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943527 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-client\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943565 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943607 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdplz\" (UniqueName: \"kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943633 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb4j7\" (UniqueName: \"kubernetes.io/projected/8d9ea088-9f19-4839-bfe4-ce54842b04c2-kube-api-access-tb4j7\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943773 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl5n5\" (UniqueName: \"kubernetes.io/projected/d94eaa88-cfd0-497d-804d-922ebd316b33-kube-api-access-zl5n5\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.943809 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm"
Mar 10 09:07:13 crc kubenswrapper[4883]: E0310 09:07:13.943949 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.443937002 +0000 UTC m=+220.698834891 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.945725 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-42rrg" event={"ID":"040462f9-c464-41cc-8843-cac46b3da8bf","Type":"ContainerStarted","Data":"6f8cc6d827bc39c1c8ed710e9a87c7da199a7cac4ffbb7438cd44cda65d0f820"}
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.945750 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-42rrg" event={"ID":"040462f9-c464-41cc-8843-cac46b3da8bf","Type":"ContainerStarted","Data":"d0223f42f5794235bdc257e8a1d33d6a24f61139c5258c9520410fdbf1e6dcfa"}
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.946049 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-42rrg"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.947945 4883 patch_prober.go:28] interesting pod/console-operator-58897d9998-42rrg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.947979 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-42rrg" podUID="040462f9-c464-41cc-8843-cac46b3da8bf" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.963760 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.975965 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.988130 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"
Mar 10 09:07:13 crc kubenswrapper[4883]: I0310 09:07:13.993212 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.021014 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.029933 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.033948 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"]
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.043868 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.045722 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.045903 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.545882691 +0000 UTC m=+220.800780580 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.050555 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-srv-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.050851 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-245mx\" (UniqueName: \"kubernetes.io/projected/459d25fc-b392-4a73-bfce-6250fc05c6e4-kube-api-access-245mx\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.050929 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lldvp\" (UniqueName: \"kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.050975 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f81e48af-a943-4b68-b259-3c0685529d42-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051001 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-srv-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051023 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051045 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggmxb\" (UniqueName: \"kubernetes.io/projected/89e1c086-5372-40ce-859d-3eb64bb06012-kube-api-access-ggmxb\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051063 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051094 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051116 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbf6q\" (UniqueName: \"kubernetes.io/projected/cf87b69c-5c1e-4297-82c9-ff39bf48b628-kube-api-access-tbf6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx"
Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051135 4883 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e56425c4-e04a-4313-a946-efc4ddac49ee-service-ca-bundle\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051167 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/57fad383-2bee-48b1-b513-32a629c976aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051189 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-csi-data-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051224 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-config\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051244 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb4j7\" (UniqueName: \"kubernetes.io/projected/8d9ea088-9f19-4839-bfe4-ce54842b04c2-kube-api-access-tb4j7\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: 
\"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051266 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl5n5\" (UniqueName: \"kubernetes.io/projected/d94eaa88-cfd0-497d-804d-922ebd316b33-kube-api-access-zl5n5\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051289 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9409438-97ce-43a6-8a7f-24764925eb53-metrics-tls\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051307 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tksbw\" (UniqueName: \"kubernetes.io/projected/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-kube-api-access-tksbw\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051341 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051356 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-socket-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051384 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051403 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-image-import-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051418 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ftn9\" (UniqueName: \"kubernetes.io/projected/3c05291a-8935-4f5e-81c8-4523b3b7e558-kube-api-access-8ftn9\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051436 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf87b69c-5c1e-4297-82c9-ff39bf48b628-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051466 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828338b4-f6a3-4a38-9596-2556459de30a-config\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051527 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp2gg\" (UniqueName: \"kubernetes.io/projected/7ba6ab17-ada9-4712-bc66-09172d648791-kube-api-access-gp2gg\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051552 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-key\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051567 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf87b69c-5c1e-4297-82c9-ff39bf48b628-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051584 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-webhook-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.051605 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d9ea088-9f19-4839-bfe4-ce54842b04c2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053743 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053778 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-serving-cert\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053798 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053823 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba6ab17-ada9-4712-bc66-09172d648791-serving-cert\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053842 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-plugins-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053882 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-metrics-certs\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.053906 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ec510e9-f96b-44da-abec-7d49115d0c83-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054135 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed479632-f556-407c-a8a9-b40379bbf549-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054191 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-registration-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054222 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57fad383-2bee-48b1-b513-32a629c976aa-proxy-tls\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054245 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9wjh\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054265 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054288 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054309 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-certs\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054329 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7zhd\" (UniqueName: \"kubernetes.io/projected/7ec510e9-f96b-44da-abec-7d49115d0c83-kube-api-access-t7zhd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054345 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-image-import-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054420 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054452 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-serving-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054490 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d8qw\" (UniqueName: \"kubernetes.io/projected/f81e48af-a943-4b68-b259-3c0685529d42-kube-api-access-4d8qw\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054536 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-node-bootstrap-token\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054560 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/828338b4-f6a3-4a38-9596-2556459de30a-serving-cert\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 
09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054591 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkbdd\" (UniqueName: \"kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd\") pod \"auto-csr-approver-29552226-jp7d9\" (UID: \"632d4971-be4e-4939-a46a-42604b182436\") " pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054610 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc6sv\" (UniqueName: \"kubernetes.io/projected/854dee0a-96a6-41f9-bdbe-d0d820684605-kube-api-access-kc6sv\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054630 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-stats-auth\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054647 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-apiservice-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054668 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7shgc\" (UniqueName: \"kubernetes.io/projected/57fad383-2bee-48b1-b513-32a629c976aa-kube-api-access-7shgc\") pod 
\"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054689 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47s58\" (UniqueName: \"kubernetes.io/projected/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-kube-api-access-47s58\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.054945 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d9ea088-9f19-4839-bfe4-ce54842b04c2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.055642 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.555628266 +0000 UTC m=+220.810526155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.056588 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.056661 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-serving-ca\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.056884 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-service-ca-bundle\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058752 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058821 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llg2x\" (UniqueName: \"kubernetes.io/projected/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-kube-api-access-llg2x\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058842 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmhgt\" (UniqueName: \"kubernetes.io/projected/6fb9cd04-d1cb-446b-9bab-b054c51df85c-kube-api-access-jmhgt\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058867 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-encryption-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058905 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 
09:07:14.058923 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fb9cd04-d1cb-446b-9bab-b054c51df85c-proxy-tls\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058952 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-client\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.058969 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.059028 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdplz\" (UniqueName: \"kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.059132 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.059995 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-config\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.060200 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061177 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061645 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061718 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061767 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061786 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-cabundle\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.061889 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-trusted-ca-bundle\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062036 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9409438-97ce-43a6-8a7f-24764925eb53-config-volume\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062069 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/89e1c086-5372-40ce-859d-3eb64bb06012-tmpfs\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062123 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062142 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9ea088-9f19-4839-bfe4-ce54842b04c2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062162 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062205 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhbbw\" (UniqueName: \"kubernetes.io/projected/828338b4-f6a3-4a38-9596-2556459de30a-kube-api-access-zhbbw\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062223 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-profile-collector-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062274 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062291 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-audit-dir\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062308 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bml5\" (UniqueName: \"kubernetes.io/projected/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-kube-api-access-9bml5\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062325 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-audit\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062342 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-images\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062359 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062420 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-mountpoint-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062439 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.062534 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.063322 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.064854 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-audit-dir\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.065119 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/459d25fc-b392-4a73-bfce-6250fc05c6e4-cert\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.065152 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhbbx\" (UniqueName: \"kubernetes.io/projected/6e2199dc-f886-4cde-aab8-60f4e4823840-kube-api-access-vhbbx\") pod \"migrator-59844c95c7-sw994\" (UID: \"6e2199dc-f886-4cde-aab8-60f4e4823840\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 
09:07:14.065155 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.065625 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7txw4\" (UniqueName: \"kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.065772 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066467 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066517 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-node-pullsecrets\") pod \"apiserver-76f77b778f-h5tmh\" (UID: 
\"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066538 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-default-certificate\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066585 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2d7p\" (UniqueName: \"kubernetes.io/projected/e56425c4-e04a-4313-a946-efc4ddac49ee-kube-api-access-h2d7p\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066607 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed479632-f556-407c-a8a9-b40379bbf549-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066625 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mnqz\" (UniqueName: \"kubernetes.io/projected/e9409438-97ce-43a6-8a7f-24764925eb53-kube-api-access-7mnqz\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066713 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/d94eaa88-cfd0-497d-804d-922ebd316b33-audit\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.066773 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed479632-f556-407c-a8a9-b40379bbf549-config\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.067313 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.069723 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94eaa88-cfd0-497d-804d-922ebd316b33-node-pullsecrets\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.070362 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed479632-f556-407c-a8a9-b40379bbf549-config\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.071207 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed479632-f556-407c-a8a9-b40379bbf549-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.071900 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.072239 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ba6ab17-ada9-4712-bc66-09172d648791-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.072980 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9ea088-9f19-4839-bfe4-ce54842b04c2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.075448 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-etcd-client\") pod \"apiserver-76f77b778f-h5tmh\" (UID: 
\"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.075614 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.076997 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.082239 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-serving-cert\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.082841 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ba6ab17-ada9-4712-bc66-09172d648791-serving-cert\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.083230 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.084765 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.086709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.090924 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d94eaa88-cfd0-497d-804d-922ebd316b33-encryption-config\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.091617 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.094624 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp2gg\" (UniqueName: \"kubernetes.io/projected/7ba6ab17-ada9-4712-bc66-09172d648791-kube-api-access-gp2gg\") pod \"authentication-operator-69f744f599-29pxk\" (UID: \"7ba6ab17-ada9-4712-bc66-09172d648791\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.107651 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb4j7\" (UniqueName: \"kubernetes.io/projected/8d9ea088-9f19-4839-bfe4-ce54842b04c2-kube-api-access-tb4j7\") pod \"openshift-controller-manager-operator-756b6f6bc6-dqr5n\" (UID: \"8d9ea088-9f19-4839-bfe4-ce54842b04c2\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.125136 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl5n5\" (UniqueName: \"kubernetes.io/projected/d94eaa88-cfd0-497d-804d-922ebd316b33-kube-api-access-zl5n5\") pod \"apiserver-76f77b778f-h5tmh\" (UID: \"d94eaa88-cfd0-497d-804d-922ebd316b33\") " pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: W0310 09:07:14.138262 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a7ee07_f81d_4e5a_aeea_b399aa39a31c.slice/crio-5fe2acbfc74cb4e318cab50fef7688fbdf7157e5096d64dd57c1d234af6fa2f6 WatchSource:0}: Error finding container 5fe2acbfc74cb4e318cab50fef7688fbdf7157e5096d64dd57c1d234af6fa2f6: Status 404 returned error can't find the container with id 5fe2acbfc74cb4e318cab50fef7688fbdf7157e5096d64dd57c1d234af6fa2f6 Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.153528 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed479632-f556-407c-a8a9-b40379bbf549-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ldtzk\" (UID: \"ed479632-f556-407c-a8a9-b40379bbf549\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.155495 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.168876 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.169113 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.669080053 +0000 UTC m=+220.923977942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169434 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ec510e9-f96b-44da-abec-7d49115d0c83-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169492 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-registration-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169511 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57fad383-2bee-48b1-b513-32a629c976aa-proxy-tls\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169537 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169558 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9wjh\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169582 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169600 
4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-certs\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169618 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7zhd\" (UniqueName: \"kubernetes.io/projected/7ec510e9-f96b-44da-abec-7d49115d0c83-kube-api-access-t7zhd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169649 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d8qw\" (UniqueName: \"kubernetes.io/projected/f81e48af-a943-4b68-b259-3c0685529d42-kube-api-access-4d8qw\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169672 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-node-bootstrap-token\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.169692 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/828338b4-f6a3-4a38-9596-2556459de30a-serving-cert\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: 
\"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170222 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkbdd\" (UniqueName: \"kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd\") pod \"auto-csr-approver-29552226-jp7d9\" (UID: \"632d4971-be4e-4939-a46a-42604b182436\") " pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170254 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc6sv\" (UniqueName: \"kubernetes.io/projected/854dee0a-96a6-41f9-bdbe-d0d820684605-kube-api-access-kc6sv\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170279 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-stats-auth\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170338 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-apiservice-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170365 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7shgc\" (UniqueName: 
\"kubernetes.io/projected/57fad383-2bee-48b1-b513-32a629c976aa-kube-api-access-7shgc\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170393 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47s58\" (UniqueName: \"kubernetes.io/projected/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-kube-api-access-47s58\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170416 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmhgt\" (UniqueName: \"kubernetes.io/projected/6fb9cd04-d1cb-446b-9bab-b054c51df85c-kube-api-access-jmhgt\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170438 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llg2x\" (UniqueName: \"kubernetes.io/projected/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-kube-api-access-llg2x\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170456 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fb9cd04-d1cb-446b-9bab-b054c51df85c-proxy-tls\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc 
kubenswrapper[4883]: I0310 09:07:14.170523 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170576 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170598 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-cabundle\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170623 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9409438-97ce-43a6-8a7f-24764925eb53-config-volume\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170639 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/89e1c086-5372-40ce-859d-3eb64bb06012-tmpfs\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170658 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170682 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhbbw\" (UniqueName: \"kubernetes.io/projected/828338b4-f6a3-4a38-9596-2556459de30a-kube-api-access-zhbbw\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170700 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-profile-collector-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170734 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bml5\" (UniqueName: \"kubernetes.io/projected/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-kube-api-access-9bml5\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170750 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-images\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170770 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-mountpoint-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170788 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170809 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhbbx\" (UniqueName: \"kubernetes.io/projected/6e2199dc-f886-4cde-aab8-60f4e4823840-kube-api-access-vhbbx\") pod \"migrator-59844c95c7-sw994\" (UID: \"6e2199dc-f886-4cde-aab8-60f4e4823840\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170827 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7txw4\" (UniqueName: \"kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170849 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/459d25fc-b392-4a73-bfce-6250fc05c6e4-cert\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170883 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-default-certificate\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170903 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2d7p\" (UniqueName: \"kubernetes.io/projected/e56425c4-e04a-4313-a946-efc4ddac49ee-kube-api-access-h2d7p\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170919 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mnqz\" (UniqueName: \"kubernetes.io/projected/e9409438-97ce-43a6-8a7f-24764925eb53-kube-api-access-7mnqz\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170937 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-srv-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170951 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-245mx\" (UniqueName: \"kubernetes.io/projected/459d25fc-b392-4a73-bfce-6250fc05c6e4-kube-api-access-245mx\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.170971 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lldvp\" (UniqueName: \"kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171003 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f81e48af-a943-4b68-b259-3c0685529d42-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171022 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-srv-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171041 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggmxb\" (UniqueName: \"kubernetes.io/projected/89e1c086-5372-40ce-859d-3eb64bb06012-kube-api-access-ggmxb\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: 
\"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171060 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171078 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbf6q\" (UniqueName: \"kubernetes.io/projected/cf87b69c-5c1e-4297-82c9-ff39bf48b628-kube-api-access-tbf6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171096 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e56425c4-e04a-4313-a946-efc4ddac49ee-service-ca-bundle\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171115 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/57fad383-2bee-48b1-b513-32a629c976aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171130 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-csi-data-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.171494 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.172182 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-registration-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.172658 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/57fad383-2bee-48b1-b513-32a629c976aa-proxy-tls\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.174020 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-cabundle\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.174624 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.174948 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/89e1c086-5372-40ce-859d-3eb64bb06012-tmpfs\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.175248 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9409438-97ce-43a6-8a7f-24764925eb53-config-volume\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.176710 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9409438-97ce-43a6-8a7f-24764925eb53-metrics-tls\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177084 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tksbw\" (UniqueName: \"kubernetes.io/projected/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-kube-api-access-tksbw\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177118 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-socket-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177331 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ftn9\" (UniqueName: \"kubernetes.io/projected/3c05291a-8935-4f5e-81c8-4523b3b7e558-kube-api-access-8ftn9\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177363 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf87b69c-5c1e-4297-82c9-ff39bf48b628-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177390 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828338b4-f6a3-4a38-9596-2556459de30a-config\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177411 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf87b69c-5c1e-4297-82c9-ff39bf48b628-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177410 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-socket-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177428 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-webhook-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.177600 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-mountpoint-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.178607 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-key\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.179180 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-csi-data-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " 
pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.179343 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.179488 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-plugins-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.179899 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.679882286 +0000 UTC m=+220.934780176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.180324 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-plugins-dir\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.180415 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6fb9cd04-d1cb-446b-9bab-b054c51df85c-images\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.180538 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-metrics-certs\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.180743 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6fb9cd04-d1cb-446b-9bab-b054c51df85c-proxy-tls\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.187253 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f81e48af-a943-4b68-b259-3c0685529d42-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.187976 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9409438-97ce-43a6-8a7f-24764925eb53-metrics-tls\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.187978 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.189153 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-profile-collector-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.191016 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-node-bootstrap-token\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 
09:07:14.191552 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-certs\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.191793 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.192207 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/57fad383-2bee-48b1-b513-32a629c976aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.193232 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-apiservice-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.195331 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89e1c086-5372-40ce-859d-3eb64bb06012-webhook-cert\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.195729 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.196093 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/828338b4-f6a3-4a38-9596-2556459de30a-serving-cert\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.196118 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ec510e9-f96b-44da-abec-7d49115d0c83-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.196343 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/828338b4-f6a3-4a38-9596-2556459de30a-config\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.196825 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cf87b69c-5c1e-4297-82c9-ff39bf48b628-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.197168 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.199345 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf87b69c-5c1e-4297-82c9-ff39bf48b628-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.201043 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.201094 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-stats-auth\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc 
kubenswrapper[4883]: I0310 09:07:14.201141 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-srv-cert\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.201422 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/459d25fc-b392-4a73-bfce-6250fc05c6e4-cert\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.201835 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-default-certificate\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.202422 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e56425c4-e04a-4313-a946-efc4ddac49ee-service-ca-bundle\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.210269 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c05291a-8935-4f5e-81c8-4523b3b7e558-srv-cert\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 
09:07:14.211149 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/854dee0a-96a6-41f9-bdbe-d0d820684605-signing-key\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.211816 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.212355 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e56425c4-e04a-4313-a946-efc4ddac49ee-metrics-certs\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.217513 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdplz\" (UniqueName: \"kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz\") pod \"route-controller-manager-6576b87f9c-76t2f\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.218126 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.218408 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad5ba8bd-51b6-42ca-94d3-eaa634ba6270-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ztqtm\" (UID: \"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.230246 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.231855 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.268853 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7shgc\" (UniqueName: \"kubernetes.io/projected/57fad383-2bee-48b1-b513-32a629c976aa-kube-api-access-7shgc\") pod \"machine-config-controller-84d6567774-sldgp\" (UID: \"57fad383-2bee-48b1-b513-32a629c976aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.281798 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.282251 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.782237867 +0000 UTC m=+221.037135757 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.290518 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7zhd\" (UniqueName: \"kubernetes.io/projected/7ec510e9-f96b-44da-abec-7d49115d0c83-kube-api-access-t7zhd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dlh8b\" (UID: \"7ec510e9-f96b-44da-abec-7d49115d0c83\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.292943 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7clc9"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.304259 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.317529 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkbdd\" (UniqueName: \"kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd\") pod \"auto-csr-approver-29552226-jp7d9\" (UID: \"632d4971-be4e-4939-a46a-42604b182436\") " pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.323827 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.329768 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc6sv\" (UniqueName: \"kubernetes.io/projected/854dee0a-96a6-41f9-bdbe-d0d820684605-kube-api-access-kc6sv\") pod \"service-ca-9c57cc56f-5fqgx\" (UID: \"854dee0a-96a6-41f9-bdbe-d0d820684605\") " pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.340811 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.348934 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d8qw\" (UniqueName: \"kubernetes.io/projected/f81e48af-a943-4b68-b259-3c0685529d42-kube-api-access-4d8qw\") pod \"package-server-manager-789f6589d5-h6zbd\" (UID: \"f81e48af-a943-4b68-b259-3c0685529d42\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.364854 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4sznb"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.367551 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhbbx\" (UniqueName: \"kubernetes.io/projected/6e2199dc-f886-4cde-aab8-60f4e4823840-kube-api-access-vhbbx\") pod \"migrator-59844c95c7-sw994\" (UID: \"6e2199dc-f886-4cde-aab8-60f4e4823840\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.373009 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.377178 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.385834 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.386624 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.886604674 +0000 UTC m=+221.141502563 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.392151 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47s58\" (UniqueName: \"kubernetes.io/projected/c134b64a-2ccb-4343-8fe3-5202c5a9b8e7-kube-api-access-47s58\") pod \"csi-hostpathplugin-kzsbn\" (UID: \"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7\") " pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.409722 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmhgt\" (UniqueName: \"kubernetes.io/projected/6fb9cd04-d1cb-446b-9bab-b054c51df85c-kube-api-access-jmhgt\") pod \"machine-config-operator-74547568cd-wpm99\" (UID: \"6fb9cd04-d1cb-446b-9bab-b054c51df85c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.414741 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.434255 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llg2x\" (UniqueName: \"kubernetes.io/projected/a47e8310-1d4f-4eba-a49f-adc4d45e7dbb-kube-api-access-llg2x\") pod \"machine-config-server-bzfz7\" (UID: \"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb\") " pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.435619 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.453017 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhbbw\" (UniqueName: \"kubernetes.io/projected/828338b4-f6a3-4a38-9596-2556459de30a-kube-api-access-zhbbw\") pod \"service-ca-operator-777779d784-x5zf2\" (UID: \"828338b4-f6a3-4a38-9596-2556459de30a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.464710 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.466851 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbf6q\" (UniqueName: \"kubernetes.io/projected/cf87b69c-5c1e-4297-82c9-ff39bf48b628-kube-api-access-tbf6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-pbmvx\" (UID: \"cf87b69c-5c1e-4297-82c9-ff39bf48b628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.477837 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-h5tmh"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.486891 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.487118 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.987079354 +0000 UTC m=+221.241977243 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.487181 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.487709 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:14.987701003 +0000 UTC m=+221.242598893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.506391 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7txw4\" (UniqueName: \"kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4\") pod \"collect-profiles-29552220-k6w5g\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.507275 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ppkhj"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.509286 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lldvp\" (UniqueName: \"kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp\") pod \"marketplace-operator-79b997595-ndt59\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.513414 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.516323 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.519414 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.523831 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nbvf4"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.526161 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bml5\" (UniqueName: \"kubernetes.io/projected/6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9-kube-api-access-9bml5\") pod \"olm-operator-6b444d44fb-6vzn4\" (UID: \"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.526456 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.540610 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-29pxk"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.547280 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2d7p\" (UniqueName: \"kubernetes.io/projected/e56425c4-e04a-4313-a946-efc4ddac49ee-kube-api-access-h2d7p\") pod \"router-default-5444994796-9vv9k\" (UID: \"e56425c4-e04a-4313-a946-efc4ddac49ee\") " pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.556667 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bzfz7" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.570836 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mnqz\" (UniqueName: \"kubernetes.io/projected/e9409438-97ce-43a6-8a7f-24764925eb53-kube-api-access-7mnqz\") pod \"dns-default-x6pxw\" (UID: \"e9409438-97ce-43a6-8a7f-24764925eb53\") " pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.580692 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.585680 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggmxb\" (UniqueName: \"kubernetes.io/projected/89e1c086-5372-40ce-859d-3eb64bb06012-kube-api-access-ggmxb\") pod \"packageserver-d55dfcdfc-7thqp\" (UID: \"89e1c086-5372-40ce-859d-3eb64bb06012\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.588757 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.589139 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.089127376 +0000 UTC m=+221.344025264 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: W0310 09:07:14.602447 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd173309_9e96_468f_a21c_f25c86186744.slice/crio-4ec8d1b5827960f6bd137ff58dec19e8986121f159740a144f7e580ab0677ffa WatchSource:0}: Error finding container 4ec8d1b5827960f6bd137ff58dec19e8986121f159740a144f7e580ab0677ffa: Status 404 returned error can't find the container with id 4ec8d1b5827960f6bd137ff58dec19e8986121f159740a144f7e580ab0677ffa Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.617117 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tksbw\" (UniqueName: \"kubernetes.io/projected/fbb98199-ce3b-4a19-bc11-a4c55d8e8df2-kube-api-access-tksbw\") pod \"multus-admission-controller-857f4d67dd-dh2nm\" (UID: \"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.626324 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ftn9\" (UniqueName: \"kubernetes.io/projected/3c05291a-8935-4f5e-81c8-4523b3b7e558-kube-api-access-8ftn9\") pod \"catalog-operator-68c6474976-mg7tt\" (UID: \"3c05291a-8935-4f5e-81c8-4523b3b7e558\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.648633 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-245mx\" (UniqueName: \"kubernetes.io/projected/459d25fc-b392-4a73-bfce-6250fc05c6e4-kube-api-access-245mx\") pod \"ingress-canary-5nj7x\" (UID: \"459d25fc-b392-4a73-bfce-6250fc05c6e4\") " pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.653442 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.660208 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.667630 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.679219 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.681403 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.690052 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.690397 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.19038542 +0000 UTC m=+221.445283309 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.690747 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.692411 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.692675 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.701657 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.720487 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.749744 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5fqgx"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.752467 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.758638 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.770356 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.793517 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.794007 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.794430 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.294414971 +0000 UTC m=+221.549312860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.858796 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.873180 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5nj7x" Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.895052 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.895670 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.39564877 +0000 UTC m=+221.650546659 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: W0310 09:07:14.945597 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d9ea088_9f19_4839_bfe4_ce54842b04c2.slice/crio-5deff5955f79c845cb7848a5a7bca7e8ca0de9262273a1ce4a777ee7c5090c7f WatchSource:0}: Error finding container 5deff5955f79c845cb7848a5a7bca7e8ca0de9262273a1ce4a777ee7c5090c7f: Status 404 returned error can't find the container with id 5deff5955f79c845cb7848a5a7bca7e8ca0de9262273a1ce4a777ee7c5090c7f Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.966687 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp"] Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.980371 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" event={"ID":"04125307-b213-4579-8042-92284900796b","Type":"ContainerStarted","Data":"3810422895eae394c04d341d1128429a7cf24177c1a49c9c0349e7e7c5fa4708"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.989862 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" event={"ID":"491a1079-cbfa-470e-b91b-84e323ae0c6d","Type":"ContainerStarted","Data":"06786ae24629774a0dc2a2be764632db9ec9884c67613486c9c06887960c9d07"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.992260 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbvf4" event={"ID":"dd173309-9e96-468f-a21c-f25c86186744","Type":"ContainerStarted","Data":"4ec8d1b5827960f6bd137ff58dec19e8986121f159740a144f7e580ab0677ffa"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.993357 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" event={"ID":"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001","Type":"ContainerStarted","Data":"04efbd56e5c4592c84ed49edf78939f30a9373f93324a0e375d863fde6c3f610"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.994380 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" event={"ID":"09a0b780-3bf5-4607-9907-33e16ae4f098","Type":"ContainerStarted","Data":"5fc42ea173c194caf3bbe9755d873f9b26e7a45998273ec4e4aaf788c0cba26c"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.994466 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" 
event={"ID":"09a0b780-3bf5-4607-9907-33e16ae4f098","Type":"ContainerStarted","Data":"13023a5482ddf6991a949dcf22f69370474d9371fad5a2db16cd534be65c0f68"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.995610 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:14 crc kubenswrapper[4883]: E0310 09:07:14.996125 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.496109123 +0000 UTC m=+221.751007012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.996244 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" event={"ID":"7ba6ab17-ada9-4712-bc66-09172d648791","Type":"ContainerStarted","Data":"4c975cdc2a429849e34dce8ac1c1d4d5a27e4fa7ddde84fa00f2598f927d599d"} Mar 10 09:07:14 crc kubenswrapper[4883]: I0310 09:07:14.997439 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" 
event={"ID":"ed479632-f556-407c-a8a9-b40379bbf549","Type":"ContainerStarted","Data":"e42417f0cfbbffbd4669b984cc60fe516dc68b01da288b467b4a7a0cdf8d7c49"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.001412 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" event={"ID":"bed2a913-4f7d-4a64-aed8-a510280c9b6b","Type":"ContainerStarted","Data":"b2fc80474c3305f3d579cf662d18d6f3cc4eeba384f56feddbc6fb18da99545c"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.010390 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" event={"ID":"d94eaa88-cfd0-497d-804d-922ebd316b33","Type":"ContainerStarted","Data":"322158f2c015958bb071cc6d007c875789a46a12002650e251b7a67f4fa5997f"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.014751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" event={"ID":"da524055-8528-423f-9ccd-70198a4fbf99","Type":"ContainerStarted","Data":"c26a515962952b0dca378df9d0df683fab142453ca7aa14be72f83e3e38823fb"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.016465 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" event={"ID":"90f9a6f1-0760-4398-80cf-70c615c7032d","Type":"ContainerStarted","Data":"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.016530 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" event={"ID":"90f9a6f1-0760-4398-80cf-70c615c7032d","Type":"ContainerStarted","Data":"4fcf975f26107b7cfd1ff1be2d34f1e281e19924c7820362af5907d5ba2ac3dc"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.016943 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.019367 4883 generic.go:334] "Generic (PLEG): container finished" podID="4210a360-cb3e-4fa8-8fd1-98217c9b00f2" containerID="dd07948b85920c5ca7cf689b80b0158f2e9b7a1ad1f4fccb474033b2a0795aa1" exitCode=0 Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.019412 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" event={"ID":"4210a360-cb3e-4fa8-8fd1-98217c9b00f2","Type":"ContainerDied","Data":"dd07948b85920c5ca7cf689b80b0158f2e9b7a1ad1f4fccb474033b2a0795aa1"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.053911 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" event={"ID":"93468e41-3e48-469f-90a9-7e05e45fe141","Type":"ContainerStarted","Data":"1c4c09e1af0a2fb9b657472a09c396cd2dc93c4029029e61abddf874ffd1ac54"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.053943 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" event={"ID":"93468e41-3e48-469f-90a9-7e05e45fe141","Type":"ContainerStarted","Data":"923cdd4f50b4442178c8bdca2b5c8bb103c402fab8b69481910326d230d89c50"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.057172 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" event={"ID":"8d9ea088-9f19-4839-bfe4-ce54842b04c2","Type":"ContainerStarted","Data":"5deff5955f79c845cb7848a5a7bca7e8ca0de9262273a1ce4a777ee7c5090c7f"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.068419 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" 
event={"ID":"d6acff1e-cd79-44a7-bb48-1a79857b2a97","Type":"ContainerStarted","Data":"1c4bfc5429a2aa7d8ca472e0f699f74024464d12512f6ea704c074a5215af6ad"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.078448 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" event={"ID":"313c1e2e-103d-4418-ab8d-9c1e1661f3f7","Type":"ContainerStarted","Data":"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.078556 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" event={"ID":"313c1e2e-103d-4418-ab8d-9c1e1661f3f7","Type":"ContainerStarted","Data":"4c062e8bdd69b4e921c05bdb270d650295db96c62cdadbfa314f1f418088417e"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.078955 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.081895 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34602: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.085954 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" event={"ID":"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270","Type":"ContainerStarted","Data":"a72fce3434ade8bf0096008cd0a6271f939804ef40b3f753cb54190e4d8d1a6d"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.086948 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.088330 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" 
event={"ID":"854dee0a-96a6-41f9-bdbe-d0d820684605","Type":"ContainerStarted","Data":"ce7d9665ad34b2597288c408a2e16f9d20c91e90fb0d0709fc942937d2f02355"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.093866 4883 generic.go:334] "Generic (PLEG): container finished" podID="04a7ee07-f81d-4e5a-aeea-b399aa39a31c" containerID="9f50c7f85e628309d8c981873187d7b98ebd10d7e244c1a7a602685cbbbca279" exitCode=0 Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.093909 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" event={"ID":"04a7ee07-f81d-4e5a-aeea-b399aa39a31c","Type":"ContainerDied","Data":"9f50c7f85e628309d8c981873187d7b98ebd10d7e244c1a7a602685cbbbca279"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.093924 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" event={"ID":"04a7ee07-f81d-4e5a-aeea-b399aa39a31c","Type":"ContainerStarted","Data":"5fe2acbfc74cb4e318cab50fef7688fbdf7157e5096d64dd57c1d234af6fa2f6"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.097298 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.098997 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.598982889 +0000 UTC m=+221.853880778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.100726 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" event={"ID":"3de74a75-4aa1-46dd-ae5b-5c82b91811e5","Type":"ContainerStarted","Data":"73a26b0f18d24208d01caab586192f525c163e6edc9825ceaa5693d09d151dda"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.100820 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" event={"ID":"3de74a75-4aa1-46dd-ae5b-5c82b91811e5","Type":"ContainerStarted","Data":"c2d528bc35131fc5a84d816a34ae1e57e68d7a2d6cf1870754994066c9c81a85"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.147298 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" event={"ID":"04118e18-43d2-4aed-9812-aba776c0bf61","Type":"ContainerStarted","Data":"6d967f23f52a9fe345bd815805ce396d68c4d0c16172123b280d34c14caadfc7"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.147348 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" event={"ID":"04118e18-43d2-4aed-9812-aba776c0bf61","Type":"ContainerStarted","Data":"c71f34b23d0938af07832d6026cab9c92192f4492913028b07ad8294e3ab5ff9"} Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.148178 4883 patch_prober.go:28] interesting pod/downloads-7954f5f757-69msk container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.148216 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-69msk" podUID="ca36a0b9-d7c9-4195-803b-53d41ac683d9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.175918 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-42rrg" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.198848 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34616: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.204953 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.210923 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.710903333 +0000 UTC m=+221.965801223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.267330 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.288875 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34624: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.289226 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-jp7d9"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.311206 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.313592 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.813578265 +0000 UTC m=+222.068476155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.319938 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.321690 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kzsbn"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.392614 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34632: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.419108 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.419229 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.919203912 +0000 UTC m=+222.174101801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.419454 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.419780 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:15.91977116 +0000 UTC m=+222.174669038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.451648 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:07:15 crc kubenswrapper[4883]: W0310 09:07:15.459642 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81e48af_a943_4b68_b259_3c0685529d42.slice/crio-333212c33e5dd63b5421066ce6179e8df37301ffb9110ba9c5b0e9c5330a01b9 WatchSource:0}: Error finding container 333212c33e5dd63b5421066ce6179e8df37301ffb9110ba9c5b0e9c5330a01b9: Status 404 returned error can't find the container with id 333212c33e5dd63b5421066ce6179e8df37301ffb9110ba9c5b0e9c5330a01b9 Mar 10 09:07:15 crc kubenswrapper[4883]: W0310 09:07:15.489992 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec510e9_f96b_44da_abec_7d49115d0c83.slice/crio-53b2048ec91d16049f11459ce8cc579f19d97009bd12bb6aa206cfde30f5952b WatchSource:0}: Error finding container 53b2048ec91d16049f11459ce8cc579f19d97009bd12bb6aa206cfde30f5952b: Status 404 returned error can't find the container with id 53b2048ec91d16049f11459ce8cc579f19d97009bd12bb6aa206cfde30f5952b Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.496938 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34648: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.510688 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.521672 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.522211 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.022194657 +0000 UTC m=+222.277092547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.556090 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dh2nm"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.607092 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-fjdx9" podStartSLOduration=193.607069043 podStartE2EDuration="3m13.607069043s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
09:07:15.559810043 +0000 UTC m=+221.814707933" watchObservedRunningTime="2026-03-10 09:07:15.607069043 +0000 UTC m=+221.861966932" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.610040 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5nj7x"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.612518 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34652: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.615297 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx"] Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.623384 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.623732 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.123716499 +0000 UTC m=+222.378614377 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.661894 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4sznb" podStartSLOduration=193.661872131 podStartE2EDuration="3m13.661872131s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:15.653122712 +0000 UTC m=+221.908020601" watchObservedRunningTime="2026-03-10 09:07:15.661872131 +0000 UTC m=+221.916770021" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.724783 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.725887 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.225755584 +0000 UTC m=+222.480653483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.726382 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.726705 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.226694501 +0000 UTC m=+222.481592390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: W0310 09:07:15.739061 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbb98199_ce3b_4a19_bc11_a4c55d8e8df2.slice/crio-e4984f5d6447b59945f2f4032d8994328bdcb77ec70f7983a812672eca1f58f3 WatchSource:0}: Error finding container e4984f5d6447b59945f2f4032d8994328bdcb77ec70f7983a812672eca1f58f3: Status 404 returned error can't find the container with id e4984f5d6447b59945f2f4032d8994328bdcb77ec70f7983a812672eca1f58f3 Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.825907 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34664: no serving certificate available for the kubelet" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.828982 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.829404 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.329386375 +0000 UTC m=+222.584284264 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.877140 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-69msk" podStartSLOduration=193.877125647 podStartE2EDuration="3m13.877125647s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:15.876142768 +0000 UTC m=+222.131040656" watchObservedRunningTime="2026-03-10 09:07:15.877125647 +0000 UTC m=+222.132023536" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.931251 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:15 crc kubenswrapper[4883]: E0310 09:07:15.932362 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.432346093 +0000 UTC m=+222.687243982 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.966034 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4bcv8" podStartSLOduration=193.966011242 podStartE2EDuration="3m13.966011242s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:15.948390714 +0000 UTC m=+222.203288603" watchObservedRunningTime="2026-03-10 09:07:15.966011242 +0000 UTC m=+222.220909132" Mar 10 09:07:15 crc kubenswrapper[4883]: I0310 09:07:15.969211 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2"] Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.014328 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.038350 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.038880 4883 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.538864057 +0000 UTC m=+222.793761947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.068636 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-42rrg" podStartSLOduration=194.068614489 podStartE2EDuration="3m14.068614489s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.028794654 +0000 UTC m=+222.283692543" watchObservedRunningTime="2026-03-10 09:07:16.068614489 +0000 UTC m=+222.323512378" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.070534 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4"] Mar 10 09:07:16 crc kubenswrapper[4883]: W0310 09:07:16.085589 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod828338b4_f6a3_4a38_9596_2556459de30a.slice/crio-86718298f8d5d1fae8fd62a6bc92aeb4ee6024d977db73ae2f5f882199fca13d WatchSource:0}: Error finding container 86718298f8d5d1fae8fd62a6bc92aeb4ee6024d977db73ae2f5f882199fca13d: Status 404 returned error can't find the container with id 
86718298f8d5d1fae8fd62a6bc92aeb4ee6024d977db73ae2f5f882199fca13d Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.118348 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"] Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.144611 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.145011 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.645000285 +0000 UTC m=+222.899898174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.206577 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbvf4" event={"ID":"dd173309-9e96-468f-a21c-f25c86186744","Type":"ContainerStarted","Data":"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.221662 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" event={"ID":"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7","Type":"ContainerStarted","Data":"61baee90e8b438025e934296e5a650a7588c153da10769e3db60481de3622944"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.224877 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x6pxw"] Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.246126 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.246584 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 09:07:16.746570336 +0000 UTC m=+223.001468226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.267261 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34676: no serving certificate available for the kubelet" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.267833 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" event={"ID":"8c698d2a-4a9b-4e6a-9b20-3b3c2ca2e001","Type":"ContainerStarted","Data":"91aa8a80f7a942bed2a8a74ac537a8962d5fb40a52ca30f835d0a4d9536c78d8"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.300371 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" event={"ID":"7ba6ab17-ada9-4712-bc66-09172d648791","Type":"ContainerStarted","Data":"a1365eacac9555d17db63c4bb03627cf9490e95ce606ef4451640160334e1ed5"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.318865 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp"] Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.326860 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" event={"ID":"828338b4-f6a3-4a38-9596-2556459de30a","Type":"ContainerStarted","Data":"86718298f8d5d1fae8fd62a6bc92aeb4ee6024d977db73ae2f5f882199fca13d"} Mar 10 09:07:16 crc 
kubenswrapper[4883]: I0310 09:07:16.333992 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" event={"ID":"57fad383-2bee-48b1-b513-32a629c976aa","Type":"ContainerStarted","Data":"fe5b2595f3d0c0bff90d35e233f13461651357c92a710f6d2c67899a77075aae"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.334034 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" event={"ID":"57fad383-2bee-48b1-b513-32a629c976aa","Type":"ContainerStarted","Data":"7b8babb49c8f6284579884f142ce0d9eb11e924b3d290da9f46fb58258ccc552"} Mar 10 09:07:16 crc kubenswrapper[4883]: W0310 09:07:16.339609 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9409438_97ce_43a6_8a7f_24764925eb53.slice/crio-32009b107107518d961d11d18e34817a517e07d8c43071edaacf7cfe686e5fa6 WatchSource:0}: Error finding container 32009b107107518d961d11d18e34817a517e07d8c43071edaacf7cfe686e5fa6: Status 404 returned error can't find the container with id 32009b107107518d961d11d18e34817a517e07d8c43071edaacf7cfe686e5fa6 Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.348021 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.350104 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 09:07:16.850093054 +0000 UTC m=+223.104990943 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.370683 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" event={"ID":"f81e48af-a943-4b68-b259-3c0685529d42","Type":"ContainerStarted","Data":"333212c33e5dd63b5421066ce6179e8df37301ffb9110ba9c5b0e9c5330a01b9"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.386918 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" event={"ID":"632d4971-be4e-4939-a46a-42604b182436","Type":"ContainerStarted","Data":"de61b8fd98c13cb0710e4560c66bd5b5056787cf78c03765e45a9dc01a3d0bf9"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.409783 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9vv9k" event={"ID":"e56425c4-e04a-4313-a946-efc4ddac49ee","Type":"ContainerStarted","Data":"8f169b8b6f5fb205e9aa9844776c9cb7bb1c7eec9edcee3d35102d75c42337fa"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.412430 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99"] Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.421034 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt"] Mar 10 09:07:16 crc 
kubenswrapper[4883]: I0310 09:07:16.438432 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" event={"ID":"09a0b780-3bf5-4607-9907-33e16ae4f098","Type":"ContainerStarted","Data":"a0f3b7efe8911bf9d9034a64956d059c8fc9413cd2a119b82665786612bca0d7"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.449446 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.450612 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.950598883 +0000 UTC m=+223.205496773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.450762 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.451793 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:16.951785727 +0000 UTC m=+223.206683616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.485723 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" event={"ID":"bed2a913-4f7d-4a64-aed8-a510280c9b6b","Type":"ContainerStarted","Data":"d814627899b16948e8f7553b870f65a92008d7092c8bab34fb02d5a7b27d6dd3"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.497238 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"] Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.515562 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bzfz7" event={"ID":"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb","Type":"ContainerStarted","Data":"f7c83fe73e51ea54e398d880432b376da1707bc93dd19c084d22ee954413ba08"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.517958 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" event={"ID":"6e2199dc-f886-4cde-aab8-60f4e4823840","Type":"ContainerStarted","Data":"d5461303fb0081511ff9d46b0acfb4e13f7e46d3169f64d643fba42cfa571529"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.531435 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" podStartSLOduration=194.531419487 podStartE2EDuration="3m14.531419487s" podCreationTimestamp="2026-03-10 09:04:02 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.509146802 +0000 UTC m=+222.764044691" watchObservedRunningTime="2026-03-10 09:07:16.531419487 +0000 UTC m=+222.786317377" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.533076 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" event={"ID":"04125307-b213-4579-8042-92284900796b","Type":"ContainerStarted","Data":"4c5f20570e22466208fc0ff8da3ccc018d409605de288a6887be0d64a2a3364f"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.534301 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" event={"ID":"04125307-b213-4579-8042-92284900796b","Type":"ContainerStarted","Data":"d26b7504a5b7a8938a38786ee5deaca63c220462d0424a2067493e687ecf6846"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.534409 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" event={"ID":"491a1079-cbfa-470e-b91b-84e323ae0c6d","Type":"ContainerStarted","Data":"d723cb7d45e224806bacb387153460074d6ee328c83925e0765b2da14178fc7f"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.547605 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" event={"ID":"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2","Type":"ContainerStarted","Data":"e4984f5d6447b59945f2f4032d8994328bdcb77ec70f7983a812672eca1f58f3"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.551623 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.552592 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.052578266 +0000 UTC m=+223.307476154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.564171 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" event={"ID":"854dee0a-96a6-41f9-bdbe-d0d820684605","Type":"ContainerStarted","Data":"7ea2d5c378d4fff852d4edd0fe20824276dc100d9936799346d55f54776bf476"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.623316 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5nj7x" event={"ID":"459d25fc-b392-4a73-bfce-6250fc05c6e4","Type":"ContainerStarted","Data":"ccf6dd7a024cec4914a4d8426a08e5a3d685a525ec697953b9a21ccfe71f8a88"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.645666 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" event={"ID":"da524055-8528-423f-9ccd-70198a4fbf99","Type":"ContainerStarted","Data":"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.647550 4883 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.655720 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.658310 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.158297067 +0000 UTC m=+223.413194955 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.668306 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" event={"ID":"cf87b69c-5c1e-4297-82c9-ff39bf48b628","Type":"ContainerStarted","Data":"1781318da7f0e95081973e6cbcd4b2f78521117f06ab141da6e3a67f40c29484"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.669498 4883 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-76t2f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness 
probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.669533 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" podUID="da524055-8528-423f-9ccd-70198a4fbf99" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.702557 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.722503 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" podStartSLOduration=194.722468321 podStartE2EDuration="3m14.722468321s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.721595717 +0000 UTC m=+222.976493606" watchObservedRunningTime="2026-03-10 09:07:16.722468321 +0000 UTC m=+222.977366210" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.745434 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" event={"ID":"3de74a75-4aa1-46dd-ae5b-5c82b91811e5","Type":"ContainerStarted","Data":"641f35eaaa1fca530ef6bb4774bf889868ee76f008b7fba79c8a3b9564104bcb"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.757290 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.760348 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.260323718 +0000 UTC m=+223.515221606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.765212 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" event={"ID":"7ec510e9-f96b-44da-abec-7d49115d0c83","Type":"ContainerStarted","Data":"53b2048ec91d16049f11459ce8cc579f19d97009bd12bb6aa206cfde30f5952b"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.777391 4883 generic.go:334] "Generic (PLEG): container finished" podID="d94eaa88-cfd0-497d-804d-922ebd316b33" containerID="e9bc417da75478fa9167ebb9aee182ee2c4e5f0223fffec129e37b685bde91c7" exitCode=0 Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.777903 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" event={"ID":"d94eaa88-cfd0-497d-804d-922ebd316b33","Type":"ContainerDied","Data":"e9bc417da75478fa9167ebb9aee182ee2c4e5f0223fffec129e37b685bde91c7"} Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.784623 4883 patch_prober.go:28] interesting pod/downloads-7954f5f757-69msk 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.784689 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-69msk" podUID="ca36a0b9-d7c9-4195-803b-53d41ac683d9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.817691 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nbvf4" podStartSLOduration=194.817666949 podStartE2EDuration="3m14.817666949s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.78496941 +0000 UTC m=+223.039867300" watchObservedRunningTime="2026-03-10 09:07:16.817666949 +0000 UTC m=+223.072564838" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.835046 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" podStartSLOduration=194.835010846 podStartE2EDuration="3m14.835010846s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.83061419 +0000 UTC m=+223.085512079" watchObservedRunningTime="2026-03-10 09:07:16.835010846 +0000 UTC m=+223.089908735" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.868844 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.873581 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.373549741 +0000 UTC m=+223.628447630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.882138 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-stgnk" podStartSLOduration=194.882120955 podStartE2EDuration="3m14.882120955s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.87499444 +0000 UTC m=+223.129892329" watchObservedRunningTime="2026-03-10 09:07:16.882120955 +0000 UTC m=+223.137018844" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.921555 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" podStartSLOduration=194.921530738 podStartE2EDuration="3m14.921530738s" podCreationTimestamp="2026-03-10 
09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.91978405 +0000 UTC m=+223.174681929" watchObservedRunningTime="2026-03-10 09:07:16.921530738 +0000 UTC m=+223.176428627" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.962641 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" podStartSLOduration=194.962621486 podStartE2EDuration="3m14.962621486s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:16.961142212 +0000 UTC m=+223.216040101" watchObservedRunningTime="2026-03-10 09:07:16.962621486 +0000 UTC m=+223.217519376" Mar 10 09:07:16 crc kubenswrapper[4883]: I0310 09:07:16.978721 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:16 crc kubenswrapper[4883]: E0310 09:07:16.979241 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.479227725 +0000 UTC m=+223.734125614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.000604 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34682: no serving certificate available for the kubelet" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.092752 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.093285 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.593273961 +0000 UTC m=+223.848171851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.099604 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sfsrd" podStartSLOduration=195.099587015 podStartE2EDuration="3m15.099587015s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.00270632 +0000 UTC m=+223.257604209" watchObservedRunningTime="2026-03-10 09:07:17.099587015 +0000 UTC m=+223.354484904" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.100144 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5fqgx" podStartSLOduration=195.100139345 podStartE2EDuration="3m15.100139345s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.081396523 +0000 UTC m=+223.336294413" watchObservedRunningTime="2026-03-10 09:07:17.100139345 +0000 UTC m=+223.355037234" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.167490 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-bzfz7" podStartSLOduration=6.167448545 podStartE2EDuration="6.167448545s" podCreationTimestamp="2026-03-10 09:07:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.153127818 +0000 UTC m=+223.408025708" watchObservedRunningTime="2026-03-10 09:07:17.167448545 +0000 UTC m=+223.422346424" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.195272 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.195737 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.695721885 +0000 UTC m=+223.950619774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.261960 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v75tm" podStartSLOduration=195.261942517 podStartE2EDuration="3m15.261942517s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.254161971 +0000 UTC m=+223.509059861" watchObservedRunningTime="2026-03-10 09:07:17.261942517 +0000 UTC m=+223.516840406" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.263110 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-29pxk" podStartSLOduration=195.263104003 podStartE2EDuration="3m15.263104003s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.213104337 +0000 UTC m=+223.468002225" watchObservedRunningTime="2026-03-10 09:07:17.263104003 +0000 UTC m=+223.518001892" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.298241 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.298588 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.798574142 +0000 UTC m=+224.053472021 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.307461 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-76h4k" podStartSLOduration=195.307442485 podStartE2EDuration="3m15.307442485s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.30431638 +0000 UTC m=+223.559214269" watchObservedRunningTime="2026-03-10 09:07:17.307442485 +0000 UTC m=+223.562340375" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.400523 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:17 crc kubenswrapper[4883]: 
E0310 09:07:17.401083 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:17.901061771 +0000 UTC m=+224.155959660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.401733 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7clc9" podStartSLOduration=195.401715671 podStartE2EDuration="3m15.401715671s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.399790878 +0000 UTC m=+223.654688768" watchObservedRunningTime="2026-03-10 09:07:17.401715671 +0000 UTC m=+223.656613560" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.401844 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" podStartSLOduration=195.401838503 podStartE2EDuration="3m15.401838503s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.370722561 +0000 UTC m=+223.625620450" watchObservedRunningTime="2026-03-10 09:07:17.401838503 
+0000 UTC m=+223.656736392" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.448940 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.448988 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.504413 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.504753 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.004740152 +0000 UTC m=+224.259638041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.606058 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.606776 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.106761262 +0000 UTC m=+224.361659150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.609349 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.675028 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" podStartSLOduration=195.675010712 podStartE2EDuration="3m15.675010712s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.485075214 +0000 UTC m=+223.739973103" watchObservedRunningTime="2026-03-10 09:07:17.675010712 +0000 UTC m=+223.929908602" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.710596 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.711950 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 09:07:18.211929256 +0000 UTC m=+224.466827145 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.814798 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.815524 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.315505836 +0000 UTC m=+224.570403724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.848693 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" event={"ID":"89e1c086-5372-40ce-859d-3eb64bb06012","Type":"ContainerStarted","Data":"de640c1d204a6a6bc9d8006403f61dc265c51a63476abfb40f9929189e3047c5"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.848748 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" event={"ID":"89e1c086-5372-40ce-859d-3eb64bb06012","Type":"ContainerStarted","Data":"bc1a39c52b64b8e4030fb860cfe5d517b23ce28f670bc80cdab587d0fb8973e0"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.849305 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.865335 4883 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7thqp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.865390 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" podUID="89e1c086-5372-40ce-859d-3eb64bb06012" containerName="packageserver" probeResult="failure" 
output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.876717 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" podStartSLOduration=195.876703892 podStartE2EDuration="3m15.876703892s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:17.874812282 +0000 UTC m=+224.129710171" watchObservedRunningTime="2026-03-10 09:07:17.876703892 +0000 UTC m=+224.131601782" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.909913 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dlh8b" event={"ID":"7ec510e9-f96b-44da-abec-7d49115d0c83","Type":"ContainerStarted","Data":"b05595c1bee7c0fb7806f0fe9b4fdda6953eaeb7ccc98597a71597db47aa9201"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.919375 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:17 crc kubenswrapper[4883]: E0310 09:07:17.919768 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.419746494 +0000 UTC m=+224.674644382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.955089 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" event={"ID":"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7","Type":"ContainerStarted","Data":"149154dacd692520edab53ab1b6b290faad89add480249f478282a1ccb53512f"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.976504 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" event={"ID":"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9","Type":"ContainerStarted","Data":"0ba4db3001272752a5c0f3ef5737b4e8677404a186b994685a1217279d191778"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.976562 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" event={"ID":"6b8ab70c-32e9-4b51-a5b5-859ff1bb38b9","Type":"ContainerStarted","Data":"12a67716dcb0f3e0591bf818028fe9112722ac345728f4332cd4beae4bdc154b"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.977642 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.995611 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.997170 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" event={"ID":"6e2199dc-f886-4cde-aab8-60f4e4823840","Type":"ContainerStarted","Data":"5844bc14d9bbcd35e0aab47f240d022209dee44b2d46584f185ba336ec373815"} Mar 10 09:07:17 crc kubenswrapper[4883]: I0310 09:07:17.997197 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" event={"ID":"6e2199dc-f886-4cde-aab8-60f4e4823840","Type":"ContainerStarted","Data":"c65052a843165285629e1d749e51ea6da22f549acfbdc6d218a5d058b7887eed"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.009592 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vzn4" podStartSLOduration=196.009571147 podStartE2EDuration="3m16.009571147s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.008800756 +0000 UTC m=+224.263698646" watchObservedRunningTime="2026-03-10 09:07:18.009571147 +0000 UTC m=+224.264469036" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.014136 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" event={"ID":"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2","Type":"ContainerStarted","Data":"cd73b445e62902774b57ff3e5a972ad3b51cb219983678fbd1cf0106b444b75e"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.021878 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.022878 4883 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.52286444 +0000 UTC m=+224.777762329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.051772 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" event={"ID":"57fad383-2bee-48b1-b513-32a629c976aa","Type":"ContainerStarted","Data":"ed668a885ff69442249f69c3eacf8edf8c8d73846327f5d249e38422d1a1839c"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.070465 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sw994" podStartSLOduration=196.07045382 podStartE2EDuration="3m16.07045382s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.069903515 +0000 UTC m=+224.324801404" watchObservedRunningTime="2026-03-10 09:07:18.07045382 +0000 UTC m=+224.325351709" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.127803 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.128949 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.628932428 +0000 UTC m=+224.883830318 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.135529 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" event={"ID":"491a1079-cbfa-470e-b91b-84e323ae0c6d","Type":"ContainerStarted","Data":"0cf7088425cc1ceed196102e080795a89f935be52bb32bd59cc200e8a0be03aa"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.158993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" event={"ID":"4210a360-cb3e-4fa8-8fd1-98217c9b00f2","Type":"ContainerStarted","Data":"c2165751bc44a713fdd66d7612b6042159fdbc505f8596dd35eb68f80753c177"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.164141 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ppkhj" podStartSLOduration=196.164129152 podStartE2EDuration="3m16.164129152s" 
podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.162714679 +0000 UTC m=+224.417612568" watchObservedRunningTime="2026-03-10 09:07:18.164129152 +0000 UTC m=+224.419027040" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.164768 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sldgp" podStartSLOduration=196.164762744 podStartE2EDuration="3m16.164762744s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.103530021 +0000 UTC m=+224.358427901" watchObservedRunningTime="2026-03-10 09:07:18.164762744 +0000 UTC m=+224.419660633" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.174199 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" event={"ID":"3c05291a-8935-4f5e-81c8-4523b3b7e558","Type":"ContainerStarted","Data":"22dbd6a924198c07ada91ae2264df7ef353cbd906df2bb998720c1f7f679d6e3"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.174230 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" event={"ID":"3c05291a-8935-4f5e-81c8-4523b3b7e558","Type":"ContainerStarted","Data":"a1e81e20922b5a051ba9d09c2d31d2a47efd773a0bad99d72604af38b8e61429"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.174929 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.195884 4883 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mg7tt container/catalog-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.195952 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" podUID="3c05291a-8935-4f5e-81c8-4523b3b7e558" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.202692 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" podStartSLOduration=196.202674808 podStartE2EDuration="3m16.202674808s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.200574054 +0000 UTC m=+224.455471933" watchObservedRunningTime="2026-03-10 09:07:18.202674808 +0000 UTC m=+224.457572697" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.204210 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" event={"ID":"6fb9cd04-d1cb-446b-9bab-b054c51df85c","Type":"ContainerStarted","Data":"ef5d210ef7d367a4fe800288d31a24b1360c72c641186b395417e2487c44a950"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.204249 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" event={"ID":"6fb9cd04-d1cb-446b-9bab-b054c51df85c","Type":"ContainerStarted","Data":"d0fd1f1ab125bfbee59b8d2d35bde536df9636926ac723fb91c8abad3ee4e6d6"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.214674 4883 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" event={"ID":"0be14f8e-b9d8-4058-9be3-cdc61ce88626","Type":"ContainerStarted","Data":"d15aaac8a80d44b120c58eb49e71ed5d06a04daa936e43f5e09fe3fbce5d0142"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.214706 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" event={"ID":"0be14f8e-b9d8-4058-9be3-cdc61ce88626","Type":"ContainerStarted","Data":"b2de766c7f3e1df780306fb20dfcae02ace3d5080579f4d7e5fa2a3d5480fbbb"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.231066 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.232086 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.732070061 +0000 UTC m=+224.986967950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.234543 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" event={"ID":"ad5ba8bd-51b6-42ca-94d3-eaa634ba6270","Type":"ContainerStarted","Data":"f072ddcc741d44b14f25e0e23062c848ba9921be414e2d336c03c8991d7ca771"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.249784 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerStarted","Data":"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.249817 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerStarted","Data":"ee1e42ffe97556105d0510c897a1238a2dd105fd96a60722e66b11e2fc0634b8"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.250785 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.254661 4883 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ndt59 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: 
connection refused" start-of-body= Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.254690 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.255422 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" podStartSLOduration=196.255409484 podStartE2EDuration="3m16.255409484s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.233669672 +0000 UTC m=+224.488567561" watchObservedRunningTime="2026-03-10 09:07:18.255409484 +0000 UTC m=+224.510307374" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.257068 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" podStartSLOduration=196.257061103 podStartE2EDuration="3m16.257061103s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.255825838 +0000 UTC m=+224.510723726" watchObservedRunningTime="2026-03-10 09:07:18.257061103 +0000 UTC m=+224.511958992" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.267390 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bzfz7" event={"ID":"a47e8310-1d4f-4eba-a49f-adc4d45e7dbb","Type":"ContainerStarted","Data":"1681d12f7f409c28c6346728a6f81e7b5575f9cb9073cc8bfd674a9c57e11468"} Mar 10 09:07:18 crc 
kubenswrapper[4883]: I0310 09:07:18.277949 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ldtzk" event={"ID":"ed479632-f556-407c-a8a9-b40379bbf549","Type":"ContainerStarted","Data":"347b506ea4fac471db22f0d72d48d1bd6bb4df665e8acc357b7f7ebec7fc7c86"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.278520 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" podStartSLOduration=196.27850586 podStartE2EDuration="3m16.27850586s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.277402203 +0000 UTC m=+224.532300092" watchObservedRunningTime="2026-03-10 09:07:18.27850586 +0000 UTC m=+224.533403749" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.299151 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5nj7x" event={"ID":"459d25fc-b392-4a73-bfce-6250fc05c6e4","Type":"ContainerStarted","Data":"03e9fdec684fe3ef272fd36f8035817f2575a4cceab0125ea9ee75cab1747985"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.312959 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" podStartSLOduration=196.312944727 podStartE2EDuration="3m16.312944727s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.308862052 +0000 UTC m=+224.563759931" watchObservedRunningTime="2026-03-10 09:07:18.312944727 +0000 UTC m=+224.567842616" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.333350 4883 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ztqtm" podStartSLOduration=196.333327104 podStartE2EDuration="3m16.333327104s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.328734479 +0000 UTC m=+224.583632367" watchObservedRunningTime="2026-03-10 09:07:18.333327104 +0000 UTC m=+224.588224993" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.335198 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.337975 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.837962549 +0000 UTC m=+225.092860438 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.359795 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34690: no serving certificate available for the kubelet" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.366709 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x6pxw" event={"ID":"e9409438-97ce-43a6-8a7f-24764925eb53","Type":"ContainerStarted","Data":"ea4c74a55ed009b35dd59b065aadde0e8ff5437953fa2f9b2f59287769940fc5"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.366748 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x6pxw" event={"ID":"e9409438-97ce-43a6-8a7f-24764925eb53","Type":"ContainerStarted","Data":"32009b107107518d961d11d18e34817a517e07d8c43071edaacf7cfe686e5fa6"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.366898 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.367576 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5nj7x" podStartSLOduration=7.367556316 podStartE2EDuration="7.367556316s" podCreationTimestamp="2026-03-10 09:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.366812465 +0000 UTC m=+224.621710355" watchObservedRunningTime="2026-03-10 09:07:18.367556316 +0000 UTC m=+224.622454205" Mar 
10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.398827 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" event={"ID":"04a7ee07-f81d-4e5a-aeea-b399aa39a31c","Type":"ContainerStarted","Data":"b4a95b1abde7e971c4eb667cf87a07279390514097e272219e2b2b3eed4701c6"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.428192 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9vv9k" event={"ID":"e56425c4-e04a-4313-a946-efc4ddac49ee","Type":"ContainerStarted","Data":"6d3206c5973d2478a28e08951118b229a80aeb90c9cb7aa645bf8556a70cf664"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.436558 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.438127 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:18.938090366 +0000 UTC m=+225.192988245 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.452910 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" event={"ID":"cf87b69c-5c1e-4297-82c9-ff39bf48b628","Type":"ContainerStarted","Data":"52c146b14da7df48ffeb2a6dc9c82da3b3a44e3504c5af70d26d827396663483"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.461937 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9vv9k" podStartSLOduration=196.461924321 podStartE2EDuration="3m16.461924321s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.459969901 +0000 UTC m=+224.714867790" watchObservedRunningTime="2026-03-10 09:07:18.461924321 +0000 UTC m=+224.716822210" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.464706 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x6pxw" podStartSLOduration=7.464697531 podStartE2EDuration="7.464697531s" podCreationTimestamp="2026-03-10 09:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.399712685 +0000 UTC m=+224.654610574" watchObservedRunningTime="2026-03-10 09:07:18.464697531 +0000 UTC m=+224.719595420" Mar 10 09:07:18 crc 
kubenswrapper[4883]: I0310 09:07:18.465749 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dqr5n" event={"ID":"8d9ea088-9f19-4839-bfe4-ce54842b04c2","Type":"ContainerStarted","Data":"88fd3b28cd926a2269d5b549a9fe268da29b27e7e2d4ef1b074ba4370fdae271"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.473159 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" event={"ID":"828338b4-f6a3-4a38-9596-2556459de30a","Type":"ContainerStarted","Data":"515edad77b98ea6e3d578a7af2269583e805edc35026dddce5be92f595ab31c1"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.495354 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pbmvx" podStartSLOduration=196.495339479 podStartE2EDuration="3m16.495339479s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.494065992 +0000 UTC m=+224.748963881" watchObservedRunningTime="2026-03-10 09:07:18.495339479 +0000 UTC m=+224.750237369" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.507970 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" event={"ID":"f81e48af-a943-4b68-b259-3c0685529d42","Type":"ContainerStarted","Data":"36b573f303e575f73cd357ced0e8a3184fddb367b14d95484d33c759e93cfcb4"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.508013 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" 
event={"ID":"f81e48af-a943-4b68-b259-3c0685529d42","Type":"ContainerStarted","Data":"7f9cf91370229057e7414a59d04c86c15b77698e4b193caf4906693923179325"} Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.508027 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.518788 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.543384 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.550667 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.050651277 +0000 UTC m=+225.305549166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.604378 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-x5zf2" podStartSLOduration=196.604351921 podStartE2EDuration="3m16.604351921s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.53366938 +0000 UTC m=+224.788567270" watchObservedRunningTime="2026-03-10 09:07:18.604351921 +0000 UTC m=+224.859249980" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.646569 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.654541 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.655041 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.656599 4883 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.156577899 +0000 UTC m=+225.411475788 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.672566 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.674314 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.681801 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:18 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:18 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:18 crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.681856 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 
09:07:18.706128 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" podStartSLOduration=196.706105229 podStartE2EDuration="3m16.706105229s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:18.628749596 +0000 UTC m=+224.883647485" watchObservedRunningTime="2026-03-10 09:07:18.706105229 +0000 UTC m=+224.961003117" Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.757630 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.758065 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.258048013 +0000 UTC m=+225.512945902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.858583 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.858820 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.358791709 +0000 UTC m=+225.613689598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.858860 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.859375 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.35936031 +0000 UTC m=+225.614258199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.860603 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"] Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.914881 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"] Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.960488 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.960669 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.460644623 +0000 UTC m=+225.715542512 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:18 crc kubenswrapper[4883]: I0310 09:07:18.960867 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:18 crc kubenswrapper[4883]: E0310 09:07:18.961281 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.461271584 +0000 UTC m=+225.716169473 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.061683 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:19 crc kubenswrapper[4883]: E0310 09:07:19.062004 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.561990774 +0000 UTC m=+225.816888662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.163124 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:19 crc kubenswrapper[4883]: E0310 09:07:19.163520 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.663498909 +0000 UTC m=+225.918396788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jv69n" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.189850 4883 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.264319 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:19 crc kubenswrapper[4883]: E0310 09:07:19.264863 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 09:07:19.764847666 +0000 UTC m=+226.019745554 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.266052 4883 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T09:07:19.189891821Z","Handler":null,"Name":""} Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.269272 4883 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.269308 4883 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.365675 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.371723 4883 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.371762 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.410666 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jv69n\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.466694 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.480552 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.522094 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x6pxw" event={"ID":"e9409438-97ce-43a6-8a7f-24764925eb53","Type":"ContainerStarted","Data":"f8bdb05cea0bd302fba067dc42e18876c9675c6fca3bd4b1c5448fe1446c2e77"} Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.525201 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" event={"ID":"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7","Type":"ContainerStarted","Data":"342796ca3b40259938e8434916bb313e698fc95663ce82432b26ebecce0089be"} Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.525251 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" event={"ID":"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7","Type":"ContainerStarted","Data":"a5bc9dcf31d8aa2826a579e635e9181b293dce8c155590c95e0a74bb24d8f056"} Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.527886 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" event={"ID":"d94eaa88-cfd0-497d-804d-922ebd316b33","Type":"ContainerStarted","Data":"b1ad82f63d2a99d4c686f8077a3a5a33c60b5bdf754b5d571abea724c8d4a2d9"} Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.527931 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" event={"ID":"d94eaa88-cfd0-497d-804d-922ebd316b33","Type":"ContainerStarted","Data":"09858184ad8944a52a3322affd78a26bf48307d0abb44fec979288789d56074b"} Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.529704 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wpm99" event={"ID":"6fb9cd04-d1cb-446b-9bab-b054c51df85c","Type":"ContainerStarted","Data":"3159005aa4d5ed199258cc991c0ef348bfbd4c940d7be1c132b8c59b0c379eda"} 
Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.531890 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" event={"ID":"fbb98199-ce3b-4a19-bc11-a4c55d8e8df2","Type":"ContainerStarted","Data":"554637a79e4a001b48f5e3b6a5c4bdf1ee7b4b7888edec4d101443ca83dc3d39"} Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.534829 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" podUID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" containerName="controller-manager" containerID="cri-o://6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0" gracePeriod=30 Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.535162 4883 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ndt59 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.535198 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.541789 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mg7tt" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.541961 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d7b5j" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.542064 4883 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pfgjr" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.542207 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7thqp" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.572245 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" podStartSLOduration=197.572226203 podStartE2EDuration="3m17.572226203s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:19.556759218 +0000 UTC m=+225.811657108" watchObservedRunningTime="2026-03-10 09:07:19.572226203 +0000 UTC m=+225.827124092" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.633908 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-dh2nm" podStartSLOduration=197.633889486 podStartE2EDuration="3m17.633889486s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:19.633732871 +0000 UTC m=+225.888630760" watchObservedRunningTime="2026-03-10 09:07:19.633889486 +0000 UTC m=+225.888787375" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.673649 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:19 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:19 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:19 crc kubenswrapper[4883]: 
healthz check failed Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.673714 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.713926 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:19 crc kubenswrapper[4883]: I0310 09:07:19.990150 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.021286 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.078195 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qmdc\" (UniqueName: \"kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc\") pod \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.078293 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config\") pod \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.078389 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles\") pod \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " 
Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.078418 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca\") pod \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.078497 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert\") pod \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\" (UID: \"313c1e2e-103d-4418-ab8d-9c1e1661f3f7\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.079181 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config" (OuterVolumeSpecName: "config") pod "313c1e2e-103d-4418-ab8d-9c1e1661f3f7" (UID: "313c1e2e-103d-4418-ab8d-9c1e1661f3f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.079881 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca" (OuterVolumeSpecName: "client-ca") pod "313c1e2e-103d-4418-ab8d-9c1e1661f3f7" (UID: "313c1e2e-103d-4418-ab8d-9c1e1661f3f7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.084710 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "313c1e2e-103d-4418-ab8d-9c1e1661f3f7" (UID: "313c1e2e-103d-4418-ab8d-9c1e1661f3f7"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.085060 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc" (OuterVolumeSpecName: "kube-api-access-2qmdc") pod "313c1e2e-103d-4418-ab8d-9c1e1661f3f7" (UID: "313c1e2e-103d-4418-ab8d-9c1e1661f3f7"). InnerVolumeSpecName "kube-api-access-2qmdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.085998 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "313c1e2e-103d-4418-ab8d-9c1e1661f3f7" (UID: "313c1e2e-103d-4418-ab8d-9c1e1661f3f7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.087413 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.095072 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tlhr4"] Mar 10 09:07:20 crc kubenswrapper[4883]: E0310 09:07:20.095329 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" containerName="controller-manager" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.095348 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" containerName="controller-manager" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.095450 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" containerName="controller-manager" Mar 10 09:07:20 crc 
kubenswrapper[4883]: I0310 09:07:20.096133 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.100813 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tlhr4"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.106672 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183440 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183511 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djb46\" (UniqueName: \"kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183651 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183717 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183729 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183740 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183751 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qmdc\" (UniqueName: \"kubernetes.io/projected/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-kube-api-access-2qmdc\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.183761 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313c1e2e-103d-4418-ab8d-9c1e1661f3f7-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.268252 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.269172 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.272122 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.278932 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.284860 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.284931 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.284967 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djb46\" (UniqueName: \"kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.285345 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " 
pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.285938 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.311280 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djb46\" (UniqueName: \"kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46\") pod \"community-operators-tlhr4\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.386441 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.386531 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.386560 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5m4n\" (UniqueName: \"kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n\") pod \"certified-operators-ltgv7\" (UID: 
\"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.464817 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cwsth"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.465798 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.465937 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.476044 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cwsth"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.487635 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.487706 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.487732 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5m4n\" (UniqueName: \"kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " 
pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.488434 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.488694 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.506361 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5m4n\" (UniqueName: \"kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n\") pod \"certified-operators-ltgv7\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.543127 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" event={"ID":"7bfdcb1d-e416-438c-9916-5c42cf35f2eb","Type":"ContainerStarted","Data":"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431"} Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.543239 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" event={"ID":"7bfdcb1d-e416-438c-9916-5c42cf35f2eb","Type":"ContainerStarted","Data":"ba9d597cdd4e690659606d934bb4d1fb3e310147327af93f1ac8149f438281d6"} Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.543281 4883 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.547020 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" event={"ID":"c134b64a-2ccb-4343-8fe3-5202c5a9b8e7","Type":"ContainerStarted","Data":"13258bb7a6eba84c564ff7253911568d80abdde2123bae7297828fc1d9a80d59"} Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.548718 4883 generic.go:334] "Generic (PLEG): container finished" podID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" containerID="6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0" exitCode=0 Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.548946 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.548969 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" event={"ID":"313c1e2e-103d-4418-ab8d-9c1e1661f3f7","Type":"ContainerDied","Data":"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0"} Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.548998 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pzn5l" event={"ID":"313c1e2e-103d-4418-ab8d-9c1e1661f3f7","Type":"ContainerDied","Data":"4c062e8bdd69b4e921c05bdb270d650295db96c62cdadbfa314f1f418088417e"} Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.549020 4883 scope.go:117] "RemoveContainer" containerID="6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.552845 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" 
podUID="da524055-8528-423f-9ccd-70198a4fbf99" containerName="route-controller-manager" containerID="cri-o://4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8" gracePeriod=30 Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.554287 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.563551 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" podStartSLOduration=198.563512095 podStartE2EDuration="3m18.563512095s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:20.558222197 +0000 UTC m=+226.813120086" watchObservedRunningTime="2026-03-10 09:07:20.563512095 +0000 UTC m=+226.818409984" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.577151 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-kzsbn" podStartSLOduration=9.577139516 podStartE2EDuration="9.577139516s" podCreationTimestamp="2026-03-10 09:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:20.574860847 +0000 UTC m=+226.829758737" watchObservedRunningTime="2026-03-10 09:07:20.577139516 +0000 UTC m=+226.832037405" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.577858 4883 scope.go:117] "RemoveContainer" containerID="6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.580365 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:20 crc kubenswrapper[4883]: E0310 09:07:20.580910 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0\": container with ID starting with 6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0 not found: ID does not exist" containerID="6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.580951 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0"} err="failed to get container status \"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0\": rpc error: code = NotFound desc = could not find container \"6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0\": container with ID starting with 6a05cc50f60ebea4c2f11179484d86b2c8daf3ddaeea092fb7e39039a098aee0 not found: ID does not exist" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.589279 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.589347 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: 
I0310 09:07:20.589368 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz6z4\" (UniqueName: \"kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.600020 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.602856 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pzn5l"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.665740 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.667016 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.668499 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:20 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:20 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:20 crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.668560 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.673000 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tlhr4"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.676544 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.691546 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.691726 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz6z4\" (UniqueName: \"kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " 
pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.691770 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.705089 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.705400 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: W0310 09:07:20.716863 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa724d40_49c8_4d1d_a7e9_5af8f0603e19.slice/crio-bb7cd6f6d9fdd71abf19936362d7bad0ceebe476125419e2660408cc85f60192 WatchSource:0}: Error finding container bb7cd6f6d9fdd71abf19936362d7bad0ceebe476125419e2660408cc85f60192: Status 404 returned error can't find the container with id bb7cd6f6d9fdd71abf19936362d7bad0ceebe476125419e2660408cc85f60192 Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.737806 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz6z4\" (UniqueName: 
\"kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4\") pod \"community-operators-cwsth\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.793589 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.793765 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9hhp\" (UniqueName: \"kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.793916 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.801520 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.895739 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.896039 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.896113 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9hhp\" (UniqueName: \"kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.896304 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.896535 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " 
pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.919535 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.924342 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9hhp\" (UniqueName: \"kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp\") pod \"certified-operators-dtqw6\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.937664 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:20 crc kubenswrapper[4883]: E0310 09:07:20.937950 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da524055-8528-423f-9ccd-70198a4fbf99" containerName="route-controller-manager" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.937962 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="da524055-8528-423f-9ccd-70198a4fbf99" containerName="route-controller-manager" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.938084 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="da524055-8528-423f-9ccd-70198a4fbf99" containerName="route-controller-manager" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.940113 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.944088 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.944373 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.944506 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.944943 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.945058 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.948429 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.949536 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.951070 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.952351 4883 ???:1] "http: TLS handshake error from 192.168.126.11:34704: no serving certificate available for the kubelet" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.988852 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997304 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert\") pod \"da524055-8528-423f-9ccd-70198a4fbf99\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997368 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdplz\" (UniqueName: \"kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz\") pod \"da524055-8528-423f-9ccd-70198a4fbf99\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997466 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config\") pod \"da524055-8528-423f-9ccd-70198a4fbf99\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997623 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca\") pod \"da524055-8528-423f-9ccd-70198a4fbf99\" (UID: \"da524055-8528-423f-9ccd-70198a4fbf99\") " Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997825 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997868 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997892 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbwc2\" (UniqueName: \"kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.997988 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.998109 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.998433 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config" (OuterVolumeSpecName: "config") pod "da524055-8528-423f-9ccd-70198a4fbf99" (UID: 
"da524055-8528-423f-9ccd-70198a4fbf99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:20 crc kubenswrapper[4883]: I0310 09:07:20.998460 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca" (OuterVolumeSpecName: "client-ca") pod "da524055-8528-423f-9ccd-70198a4fbf99" (UID: "da524055-8528-423f-9ccd-70198a4fbf99"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.001178 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz" (OuterVolumeSpecName: "kube-api-access-sdplz") pod "da524055-8528-423f-9ccd-70198a4fbf99" (UID: "da524055-8528-423f-9ccd-70198a4fbf99"). InnerVolumeSpecName "kube-api-access-sdplz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.001380 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "da524055-8528-423f-9ccd-70198a4fbf99" (UID: "da524055-8528-423f-9ccd-70198a4fbf99"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.038150 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cwsth"] Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.040832 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"] Mar 10 09:07:21 crc kubenswrapper[4883]: W0310 09:07:21.044921 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf98b0611_9639_4987_9e9b_0e1c4695a164.slice/crio-d460ae50500ac6d093c81ce23834fe61bddb693cf03ca9139dc421ae40806219 WatchSource:0}: Error finding container d460ae50500ac6d093c81ce23834fe61bddb693cf03ca9139dc421ae40806219: Status 404 returned error can't find the container with id d460ae50500ac6d093c81ce23834fe61bddb693cf03ca9139dc421ae40806219 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099402 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099488 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099515 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099545 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbwc2\" (UniqueName: \"kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099589 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099639 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da524055-8528-423f-9ccd-70198a4fbf99-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099656 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdplz\" (UniqueName: \"kubernetes.io/projected/da524055-8528-423f-9ccd-70198a4fbf99-kube-api-access-sdplz\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099669 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.099679 4883 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/da524055-8528-423f-9ccd-70198a4fbf99-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.100664 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.100849 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.101898 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.102684 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.117982 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbwc2\" (UniqueName: 
\"kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2\") pod \"controller-manager-b79978d66-7m8kr\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.165873 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"] Mar 10 09:07:21 crc kubenswrapper[4883]: W0310 09:07:21.179754 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804d1679_c6e1_4594_b067_c41da8ee64ab.slice/crio-7fa6f7eaccd4ad67921504c2ac2d68f6a5036f11199646ded41154976ac1c2dc WatchSource:0}: Error finding container 7fa6f7eaccd4ad67921504c2ac2d68f6a5036f11199646ded41154976ac1c2dc: Status 404 returned error can't find the container with id 7fa6f7eaccd4ad67921504c2ac2d68f6a5036f11199646ded41154976ac1c2dc Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.274691 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.485281 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:21 crc kubenswrapper[4883]: W0310 09:07:21.496747 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod625af2a2_6c46_49c9_90bd_0730adfcf9a8.slice/crio-39ca8bd84b869eeb8c74d1ff18bd08cab3d284caa5f72a4f7529690086692a85 WatchSource:0}: Error finding container 39ca8bd84b869eeb8c74d1ff18bd08cab3d284caa5f72a4f7529690086692a85: Status 404 returned error can't find the container with id 39ca8bd84b869eeb8c74d1ff18bd08cab3d284caa5f72a4f7529690086692a85 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.561320 4883 generic.go:334] "Generic (PLEG): container finished" podID="da524055-8528-423f-9ccd-70198a4fbf99" containerID="4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.561377 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" event={"ID":"da524055-8528-423f-9ccd-70198a4fbf99","Type":"ContainerDied","Data":"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.561408 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" event={"ID":"da524055-8528-423f-9ccd-70198a4fbf99","Type":"ContainerDied","Data":"c26a515962952b0dca378df9d0df683fab142453ca7aa14be72f83e3e38823fb"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.561428 4883 scope.go:117] "RemoveContainer" containerID="4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8" Mar 10 09:07:21 crc kubenswrapper[4883]: 
I0310 09:07:21.561566 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.564214 4883 generic.go:334] "Generic (PLEG): container finished" podID="0be14f8e-b9d8-4058-9be3-cdc61ce88626" containerID="d15aaac8a80d44b120c58eb49e71ed5d06a04daa936e43f5e09fe3fbce5d0142" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.564284 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" event={"ID":"0be14f8e-b9d8-4058-9be3-cdc61ce88626","Type":"ContainerDied","Data":"d15aaac8a80d44b120c58eb49e71ed5d06a04daa936e43f5e09fe3fbce5d0142"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.566558 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" event={"ID":"625af2a2-6c46-49c9-90bd-0730adfcf9a8","Type":"ContainerStarted","Data":"39ca8bd84b869eeb8c74d1ff18bd08cab3d284caa5f72a4f7529690086692a85"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.569700 4883 generic.go:334] "Generic (PLEG): container finished" podID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerID="09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.569762 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerDied","Data":"09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.569784 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" 
event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerStarted","Data":"d460ae50500ac6d093c81ce23834fe61bddb693cf03ca9139dc421ae40806219"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.572164 4883 generic.go:334] "Generic (PLEG): container finished" podID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerID="fbcddb1697b17646b99e5eb8976196f07f60605511e41f78c1cc131392bba69c" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.572550 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerDied","Data":"fbcddb1697b17646b99e5eb8976196f07f60605511e41f78c1cc131392bba69c"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.572603 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerStarted","Data":"7fa6f7eaccd4ad67921504c2ac2d68f6a5036f11199646ded41154976ac1c2dc"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.575456 4883 generic.go:334] "Generic (PLEG): container finished" podID="816c3b00-c481-4c08-9691-0244d3c044e3" containerID="a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.575530 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerDied","Data":"a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.575972 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerStarted","Data":"f5c87addac3f89a4858d25eb6fa3c57863872b10777952494e3f153096638f60"} Mar 10 09:07:21 crc kubenswrapper[4883]: 
I0310 09:07:21.581636 4883 generic.go:334] "Generic (PLEG): container finished" podID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerID="0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722" exitCode=0 Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.581702 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerDied","Data":"0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.581722 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerStarted","Data":"bb7cd6f6d9fdd71abf19936362d7bad0ceebe476125419e2660408cc85f60192"} Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.598975 4883 scope.go:117] "RemoveContainer" containerID="4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8" Mar 10 09:07:21 crc kubenswrapper[4883]: E0310 09:07:21.600788 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8\": container with ID starting with 4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8 not found: ID does not exist" containerID="4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.600825 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8"} err="failed to get container status \"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8\": rpc error: code = NotFound desc = could not find container \"4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8\": container with ID starting 
with 4e01e87be9a0ecdb1b84d24ed6211ff86cf1dbb50a2a0bb5e931a092d5e2fce8 not found: ID does not exist" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.638286 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"] Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.641286 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76t2f"] Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.663770 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:21 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:21 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:21 crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.663828 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.888937 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.889871 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.891687 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.893681 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 09:07:21 crc kubenswrapper[4883]: I0310 09:07:21.899039 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.015564 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.015685 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.046724 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.047424 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.048719 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.049435 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.059360 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.086072 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313c1e2e-103d-4418-ab8d-9c1e1661f3f7" path="/var/lib/kubelet/pods/313c1e2e-103d-4418-ab8d-9c1e1661f3f7/volumes" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.086762 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da524055-8528-423f-9ccd-70198a4fbf99" path="/var/lib/kubelet/pods/da524055-8528-423f-9ccd-70198a4fbf99/volumes" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.117088 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.117159 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: 
I0310 09:07:22.117188 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.117244 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.117501 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.135718 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.203090 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.219239 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.219281 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.219370 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.233715 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.274688 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.277941 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.279140 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.281085 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.361670 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.423002 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85hl4\" (UniqueName: \"kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.423045 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.423071 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.525240 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-85hl4\" (UniqueName: \"kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.525296 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.525316 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.525915 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.525946 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.551301 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-85hl4\" (UniqueName: \"kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4\") pod \"redhat-marketplace-2h5dv\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.589926 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" event={"ID":"625af2a2-6c46-49c9-90bd-0730adfcf9a8","Type":"ContainerStarted","Data":"7bce35fe57df60a11c326c0ddd8ec6eab1f979bc0ca526968c4c7e1e91df682e"} Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.590148 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.594609 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.595807 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.608318 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" podStartSLOduration=3.60829939 podStartE2EDuration="3.60829939s" podCreationTimestamp="2026-03-10 09:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:22.603294828 +0000 UTC m=+228.858192718" watchObservedRunningTime="2026-03-10 09:07:22.60829939 +0000 UTC m=+228.863197279" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.664744 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.664752 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:22 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:22 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:22 crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.664927 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.665888 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.678163 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.729831 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.730243 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqv7r\" (UniqueName: \"kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.730368 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.832004 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.832056 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lqv7r\" (UniqueName: \"kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.832107 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.832835 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.833110 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.848404 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqv7r\" (UniqueName: \"kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r\") pod \"redhat-marketplace-p22dp\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.937081 4883 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.938243 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.943884 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.947068 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.947310 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.947398 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.947551 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.947544 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.953858 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:22 crc kubenswrapper[4883]: I0310 09:07:22.984011 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.037322 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qdv\" (UniqueName: \"kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.037367 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.037402 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.037421 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.139498 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-58qdv\" (UniqueName: \"kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.139572 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.139610 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.139634 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.140776 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " 
pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.141048 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.145574 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.156359 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qdv\" (UniqueName: \"kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv\") pod \"route-controller-manager-59c5db475d-pf9vl\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.261337 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.273722 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.276038 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.279567 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.282755 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.356589 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.356712 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.356750 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvxh5\" (UniqueName: \"kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.458250 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvxh5\" (UniqueName: \"kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5\") pod \"redhat-operators-vhnvt\" (UID: 
\"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.458367 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.458493 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.458953 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.459112 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.473359 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvxh5\" (UniqueName: \"kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5\") pod \"redhat-operators-vhnvt\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " 
pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.592122 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.663964 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:23 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:23 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:23 crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.664033 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.669549 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"] Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.671297 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.684227 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"] Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.712013 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-69msk" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.762192 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.762270 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfh75\" (UniqueName: \"kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.762387 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.863237 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content\") pod \"redhat-operators-j5rwl\" (UID: 
\"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.863331 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.863365 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfh75\" (UniqueName: \"kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.864524 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.864556 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.878998 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfh75\" (UniqueName: \"kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75\") pod \"redhat-operators-j5rwl\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " 
pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.976530 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.976611 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.978848 4883 patch_prober.go:28] interesting pod/console-f9d7485db-nbvf4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.978907 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nbvf4" podUID="dd173309-9e96-468f-a21c-f25c86186744" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 10 09:07:23 crc kubenswrapper[4883]: I0310 09:07:23.989355 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.156295 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.156349 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.162651 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.340013 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.475616 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume\") pod \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.475705 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7txw4\" (UniqueName: \"kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4\") pod \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.475801 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume\") pod \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\" (UID: \"0be14f8e-b9d8-4058-9be3-cdc61ce88626\") " Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.476833 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume" (OuterVolumeSpecName: "config-volume") pod "0be14f8e-b9d8-4058-9be3-cdc61ce88626" (UID: "0be14f8e-b9d8-4058-9be3-cdc61ce88626"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.481098 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4" (OuterVolumeSpecName: "kube-api-access-7txw4") pod "0be14f8e-b9d8-4058-9be3-cdc61ce88626" (UID: "0be14f8e-b9d8-4058-9be3-cdc61ce88626"). 
InnerVolumeSpecName "kube-api-access-7txw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.487529 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0be14f8e-b9d8-4058-9be3-cdc61ce88626" (UID: "0be14f8e-b9d8-4058-9be3-cdc61ce88626"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.577820 4883 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be14f8e-b9d8-4058-9be3-cdc61ce88626-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.577851 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be14f8e-b9d8-4058-9be3-cdc61ce88626-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.577860 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7txw4\" (UniqueName: \"kubernetes.io/projected/0be14f8e-b9d8-4058-9be3-cdc61ce88626-kube-api-access-7txw4\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.611755 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" event={"ID":"0be14f8e-b9d8-4058-9be3-cdc61ce88626","Type":"ContainerDied","Data":"b2de766c7f3e1df780306fb20dfcae02ace3d5080579f4d7e5fa2a3d5480fbbb"} Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.611794 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.611811 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2de766c7f3e1df780306fb20dfcae02ace3d5080579f4d7e5fa2a3d5480fbbb" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.615674 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-h5tmh" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.663664 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.668538 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:24 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:24 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:24 crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:24 crc kubenswrapper[4883]: I0310 09:07:24.668595 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:25 crc kubenswrapper[4883]: I0310 09:07:25.663013 4883 patch_prober.go:28] interesting pod/router-default-5444994796-9vv9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 09:07:25 crc kubenswrapper[4883]: [-]has-synced failed: reason withheld Mar 10 09:07:25 crc kubenswrapper[4883]: [+]process-running ok Mar 10 09:07:25 
crc kubenswrapper[4883]: healthz check failed Mar 10 09:07:25 crc kubenswrapper[4883]: I0310 09:07:25.663322 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9vv9k" podUID="e56425c4-e04a-4313-a946-efc4ddac49ee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:07:26 crc kubenswrapper[4883]: I0310 09:07:26.031714 4883 ???:1] "http: TLS handshake error from 192.168.126.11:53968: no serving certificate available for the kubelet" Mar 10 09:07:26 crc kubenswrapper[4883]: I0310 09:07:26.133031 4883 ???:1] "http: TLS handshake error from 192.168.126.11:53980: no serving certificate available for the kubelet" Mar 10 09:07:26 crc kubenswrapper[4883]: I0310 09:07:26.662493 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:26 crc kubenswrapper[4883]: I0310 09:07:26.665315 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9vv9k" Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.143202 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"] Mar 10 09:07:27 crc kubenswrapper[4883]: W0310 09:07:27.165582 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod504544f8_69a4_4562_87d5_fa61335ea052.slice/crio-a0d2e9ecb5144e13f35ee0d106385391edc44733dcd6e7ac3cc5e5bac91f227d WatchSource:0}: Error finding container a0d2e9ecb5144e13f35ee0d106385391edc44733dcd6e7ac3cc5e5bac91f227d: Status 404 returned error can't find the container with id a0d2e9ecb5144e13f35ee0d106385391edc44733dcd6e7ac3cc5e5bac91f227d Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.187731 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:07:27 
crc kubenswrapper[4883]: I0310 09:07:27.300691 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:27 crc kubenswrapper[4883]: W0310 09:07:27.318940 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cbe9069_9970_4e7d_a2ec_d563c6b46a1c.slice/crio-04c01d96f1c8b1e9164cb1cc2f2e5455c16dd9c69f66112c9edab2362be5bcac WatchSource:0}: Error finding container 04c01d96f1c8b1e9164cb1cc2f2e5455c16dd9c69f66112c9edab2362be5bcac: Status 404 returned error can't find the container with id 04c01d96f1c8b1e9164cb1cc2f2e5455c16dd9c69f66112c9edab2362be5bcac Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.560866 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.621010 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.627332 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.633659 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 09:07:27 crc kubenswrapper[4883]: W0310 09:07:27.645571 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf1e38226_07ed_488a_b501_b3aeacb94bc6.slice/crio-ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58 WatchSource:0}: Error finding container ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58: Status 404 returned error can't find the container with id ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58 Mar 10 09:07:27 crc kubenswrapper[4883]: W0310 09:07:27.646914 4883 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e9f1e6_cc9a_45a0_9a03_e3b1526b5783.slice/crio-2d8f88178da3a29c5d0e4f5892aa2df9d27ee9b9f94aacf88d6eef5d1defa2ab WatchSource:0}: Error finding container 2d8f88178da3a29c5d0e4f5892aa2df9d27ee9b9f94aacf88d6eef5d1defa2ab: Status 404 returned error can't find the container with id 2d8f88178da3a29c5d0e4f5892aa2df9d27ee9b9f94aacf88d6eef5d1defa2ab Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.656426 4883 generic.go:334] "Generic (PLEG): container finished" podID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerID="b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001" exitCode=0 Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.656540 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerDied","Data":"b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.656613 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerStarted","Data":"04c01d96f1c8b1e9164cb1cc2f2e5455c16dd9c69f66112c9edab2362be5bcac"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.661293 4883 generic.go:334] "Generic (PLEG): container finished" podID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerID="50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7" exitCode=0 Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.661356 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerDied","Data":"50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 
09:07:27.661385 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerStarted","Data":"7617eec4e807a31ae8dae401f57247ee0d7df593c7506b5c96f9dc3caf16e27a"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.664843 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" event={"ID":"632d4971-be4e-4939-a46a-42604b182436","Type":"ContainerStarted","Data":"4f14915d4656a1c4b614a36f6c062aeb515069a6fa76ae902ddf80e353fe48d5"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.667314 4883 generic.go:334] "Generic (PLEG): container finished" podID="504544f8-69a4-4562-87d5-fa61335ea052" containerID="d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11" exitCode=0 Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.667419 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerDied","Data":"d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.667446 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerStarted","Data":"a0d2e9ecb5144e13f35ee0d106385391edc44733dcd6e7ac3cc5e5bac91f227d"} Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.735200 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" podStartSLOduration=76.149756017 podStartE2EDuration="1m27.735183725s" podCreationTimestamp="2026-03-10 09:06:00 +0000 UTC" firstStartedPulling="2026-03-10 09:07:15.451600344 +0000 UTC m=+221.706498232" lastFinishedPulling="2026-03-10 09:07:27.037028051 +0000 UTC m=+233.291925940" observedRunningTime="2026-03-10 
09:07:27.730421801 +0000 UTC m=+233.985319689" watchObservedRunningTime="2026-03-10 09:07:27.735183725 +0000 UTC m=+233.990081613" Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.937730 4883 csr.go:261] certificate signing request csr-hb9cx is approved, waiting to be issued Mar 10 09:07:27 crc kubenswrapper[4883]: I0310 09:07:27.937888 4883 csr.go:257] certificate signing request csr-hb9cx is issued Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.689746 4883 generic.go:334] "Generic (PLEG): container finished" podID="f1e38226-07ed-488a-b501-b3aeacb94bc6" containerID="31f6e5116ed55f5b8d1843edebd1e3733bcdea144efa8c0f68bdfcaf678a7f01" exitCode=0 Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.690091 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f1e38226-07ed-488a-b501-b3aeacb94bc6","Type":"ContainerDied","Data":"31f6e5116ed55f5b8d1843edebd1e3733bcdea144efa8c0f68bdfcaf678a7f01"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.690176 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f1e38226-07ed-488a-b501-b3aeacb94bc6","Type":"ContainerStarted","Data":"ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.694893 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" event={"ID":"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783","Type":"ContainerStarted","Data":"0313c94c278ef3dce737ae5cde52e50a38c39b491a4d01a221eda83cbceca9d1"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.694933 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" 
event={"ID":"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783","Type":"ContainerStarted","Data":"2d8f88178da3a29c5d0e4f5892aa2df9d27ee9b9f94aacf88d6eef5d1defa2ab"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.695344 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.697224 4883 generic.go:334] "Generic (PLEG): container finished" podID="83795b7e-47fe-45a0-85fe-63e8b880ddae" containerID="21f3fa430d885af5b7d8003d9010231094e6b0abcc772af52da4a0423d3fc2c7" exitCode=0 Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.697316 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"83795b7e-47fe-45a0-85fe-63e8b880ddae","Type":"ContainerDied","Data":"21f3fa430d885af5b7d8003d9010231094e6b0abcc772af52da4a0423d3fc2c7"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.697364 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"83795b7e-47fe-45a0-85fe-63e8b880ddae","Type":"ContainerStarted","Data":"2b6592679e3f9885f3d9a38472029972a6fc2dbf38719c4a43a188dbe997b60d"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.700449 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.701183 4883 generic.go:334] "Generic (PLEG): container finished" podID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerID="8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df" exitCode=0 Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.701250 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" 
event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerDied","Data":"8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.701275 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerStarted","Data":"7cd4d72ef0244e1c6f3955303b46c7d75041bd13eacfaf569a15ddb645d99b32"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.704925 4883 generic.go:334] "Generic (PLEG): container finished" podID="632d4971-be4e-4939-a46a-42604b182436" containerID="4f14915d4656a1c4b614a36f6c062aeb515069a6fa76ae902ddf80e353fe48d5" exitCode=0 Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.704965 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" event={"ID":"632d4971-be4e-4939-a46a-42604b182436","Type":"ContainerDied","Data":"4f14915d4656a1c4b614a36f6c062aeb515069a6fa76ae902ddf80e353fe48d5"} Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.744900 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" podStartSLOduration=9.744745208 podStartE2EDuration="9.744745208s" podCreationTimestamp="2026-03-10 09:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:28.740701585 +0000 UTC m=+234.995599495" watchObservedRunningTime="2026-03-10 09:07:28.744745208 +0000 UTC m=+234.999643096" Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.940437 4883 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-14 05:43:17.032041194 +0000 UTC Mar 10 09:07:28 crc kubenswrapper[4883]: I0310 09:07:28.940494 4883 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Waiting 6692h35m48.091571161s for next certificate rotation Mar 10 09:07:29 crc kubenswrapper[4883]: I0310 09:07:29.862904 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x6pxw" Mar 10 09:07:29 crc kubenswrapper[4883]: I0310 09:07:29.941570 4883 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 12:39:00.080315664 +0000 UTC Mar 10 09:07:29 crc kubenswrapper[4883]: I0310 09:07:29.941605 4883 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6435h31m30.138712163s for next certificate rotation Mar 10 09:07:30 crc kubenswrapper[4883]: I0310 09:07:30.864300 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:30 crc kubenswrapper[4883]: I0310 09:07:30.984788 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkbdd\" (UniqueName: \"kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd\") pod \"632d4971-be4e-4939-a46a-42604b182436\" (UID: \"632d4971-be4e-4939-a46a-42604b182436\") " Mar 10 09:07:30 crc kubenswrapper[4883]: I0310 09:07:30.990774 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd" (OuterVolumeSpecName: "kube-api-access-wkbdd") pod "632d4971-be4e-4939-a46a-42604b182436" (UID: "632d4971-be4e-4939-a46a-42604b182436"). InnerVolumeSpecName "kube-api-access-wkbdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.086708 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkbdd\" (UniqueName: \"kubernetes.io/projected/632d4971-be4e-4939-a46a-42604b182436-kube-api-access-wkbdd\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.191756 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.195703 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.288955 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir\") pod \"83795b7e-47fe-45a0-85fe-63e8b880ddae\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289011 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access\") pod \"f1e38226-07ed-488a-b501-b3aeacb94bc6\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289054 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "83795b7e-47fe-45a0-85fe-63e8b880ddae" (UID: "83795b7e-47fe-45a0-85fe-63e8b880ddae"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289231 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access\") pod \"83795b7e-47fe-45a0-85fe-63e8b880ddae\" (UID: \"83795b7e-47fe-45a0-85fe-63e8b880ddae\") " Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289283 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir\") pod \"f1e38226-07ed-488a-b501-b3aeacb94bc6\" (UID: \"f1e38226-07ed-488a-b501-b3aeacb94bc6\") " Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289437 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f1e38226-07ed-488a-b501-b3aeacb94bc6" (UID: "f1e38226-07ed-488a-b501-b3aeacb94bc6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289769 4883 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83795b7e-47fe-45a0-85fe-63e8b880ddae-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.289788 4883 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1e38226-07ed-488a-b501-b3aeacb94bc6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.294241 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "83795b7e-47fe-45a0-85fe-63e8b880ddae" (UID: "83795b7e-47fe-45a0-85fe-63e8b880ddae"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.294966 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f1e38226-07ed-488a-b501-b3aeacb94bc6" (UID: "f1e38226-07ed-488a-b501-b3aeacb94bc6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.391200 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/83795b7e-47fe-45a0-85fe-63e8b880ddae-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.391226 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f1e38226-07ed-488a-b501-b3aeacb94bc6-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.735221 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.735309 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f1e38226-07ed-488a-b501-b3aeacb94bc6","Type":"ContainerDied","Data":"ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58"} Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.735359 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee77aab60e91d1804cd4e1ed82a9bd9cd852e5b8963eeb759c6fb7df9b3cfe58" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.741096 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"83795b7e-47fe-45a0-85fe-63e8b880ddae","Type":"ContainerDied","Data":"2b6592679e3f9885f3d9a38472029972a6fc2dbf38719c4a43a188dbe997b60d"} Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.741142 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b6592679e3f9885f3d9a38472029972a6fc2dbf38719c4a43a188dbe997b60d" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.741166 4883 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.745238 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" event={"ID":"632d4971-be4e-4939-a46a-42604b182436","Type":"ContainerDied","Data":"de61b8fd98c13cb0710e4560c66bd5b5056787cf78c03765e45a9dc01a3d0bf9"} Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.745286 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de61b8fd98c13cb0710e4560c66bd5b5056787cf78c03765e45a9dc01a3d0bf9" Mar 10 09:07:31 crc kubenswrapper[4883]: I0310 09:07:31.745360 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-jp7d9" Mar 10 09:07:33 crc kubenswrapper[4883]: I0310 09:07:33.981930 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:33 crc kubenswrapper[4883]: I0310 09:07:33.988942 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nbvf4" Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.534020 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.535272 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerName="controller-manager" containerID="cri-o://7bce35fe57df60a11c326c0ddd8ec6eab1f979bc0ca526968c4c7e1e91df682e" gracePeriod=30 Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.548887 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:38 crc 
kubenswrapper[4883]: I0310 09:07:38.549125 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" podUID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" containerName="route-controller-manager" containerID="cri-o://0313c94c278ef3dce737ae5cde52e50a38c39b491a4d01a221eda83cbceca9d1" gracePeriod=30 Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.800709 4883 generic.go:334] "Generic (PLEG): container finished" podID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerID="7bce35fe57df60a11c326c0ddd8ec6eab1f979bc0ca526968c4c7e1e91df682e" exitCode=0 Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.800808 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" event={"ID":"625af2a2-6c46-49c9-90bd-0730adfcf9a8","Type":"ContainerDied","Data":"7bce35fe57df60a11c326c0ddd8ec6eab1f979bc0ca526968c4c7e1e91df682e"} Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.803415 4883 generic.go:334] "Generic (PLEG): container finished" podID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" containerID="0313c94c278ef3dce737ae5cde52e50a38c39b491a4d01a221eda83cbceca9d1" exitCode=0 Mar 10 09:07:38 crc kubenswrapper[4883]: I0310 09:07:38.803527 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" event={"ID":"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783","Type":"ContainerDied","Data":"0313c94c278ef3dce737ae5cde52e50a38c39b491a4d01a221eda83cbceca9d1"} Mar 10 09:07:39 crc kubenswrapper[4883]: I0310 09:07:39.719715 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.275926 4883 patch_prober.go:28] interesting pod/controller-manager-b79978d66-7m8kr container/controller-manager namespace/openshift-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.276380 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.318373 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340062 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:41 crc kubenswrapper[4883]: E0310 09:07:41.340298 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83795b7e-47fe-45a0-85fe-63e8b880ddae" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340312 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="83795b7e-47fe-45a0-85fe-63e8b880ddae" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: E0310 09:07:41.340321 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" containerName="route-controller-manager" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340327 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" containerName="route-controller-manager" Mar 10 09:07:41 crc kubenswrapper[4883]: E0310 09:07:41.340337 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be14f8e-b9d8-4058-9be3-cdc61ce88626" containerName="collect-profiles" Mar 10 09:07:41 crc 
kubenswrapper[4883]: I0310 09:07:41.340342 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be14f8e-b9d8-4058-9be3-cdc61ce88626" containerName="collect-profiles" Mar 10 09:07:41 crc kubenswrapper[4883]: E0310 09:07:41.340353 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e38226-07ed-488a-b501-b3aeacb94bc6" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340358 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e38226-07ed-488a-b501-b3aeacb94bc6" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: E0310 09:07:41.340365 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632d4971-be4e-4939-a46a-42604b182436" containerName="oc" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340371 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="632d4971-be4e-4939-a46a-42604b182436" containerName="oc" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340460 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" containerName="route-controller-manager" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340494 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be14f8e-b9d8-4058-9be3-cdc61ce88626" containerName="collect-profiles" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340504 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="632d4971-be4e-4939-a46a-42604b182436" containerName="oc" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340511 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="83795b7e-47fe-45a0-85fe-63e8b880ddae" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340520 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1e38226-07ed-488a-b501-b3aeacb94bc6" containerName="pruner" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.340986 4883 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.348461 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.471566 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca\") pod \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.471747 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert\") pod \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.471783 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58qdv\" (UniqueName: \"kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv\") pod \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.471849 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config\") pod \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\" (UID: \"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783\") " Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472059 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqsg2\" (UniqueName: 
\"kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472167 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472190 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472225 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472339 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca" (OuterVolumeSpecName: "client-ca") pod "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" (UID: "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.472426 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config" (OuterVolumeSpecName: "config") pod "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" (UID: "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.477023 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv" (OuterVolumeSpecName: "kube-api-access-58qdv") pod "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" (UID: "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783"). InnerVolumeSpecName "kube-api-access-58qdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.478248 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" (UID: "59e9f1e6-cc9a-45a0-9a03-e3b1526b5783"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.573186 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.574012 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.574093 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.574909 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.574964 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert\") pod 
\"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.575875 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqsg2\" (UniqueName: \"kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.576637 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.576678 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.576757 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.576854 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58qdv\" (UniqueName: \"kubernetes.io/projected/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783-kube-api-access-58qdv\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.580496 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: 
\"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.589609 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqsg2\" (UniqueName: \"kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2\") pod \"route-controller-manager-7b9ddc79f6-tq6gx\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.661062 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.831261 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" event={"ID":"59e9f1e6-cc9a-45a0-9a03-e3b1526b5783","Type":"ContainerDied","Data":"2d8f88178da3a29c5d0e4f5892aa2df9d27ee9b9f94aacf88d6eef5d1defa2ab"} Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.831585 4883 scope.go:117] "RemoveContainer" containerID="0313c94c278ef3dce737ae5cde52e50a38c39b491a4d01a221eda83cbceca9d1" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.831314 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl" Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.856710 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:41 crc kubenswrapper[4883]: I0310 09:07:41.859639 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59c5db475d-pf9vl"] Mar 10 09:07:42 crc kubenswrapper[4883]: I0310 09:07:42.102316 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e9f1e6-cc9a-45a0-9a03-e3b1526b5783" path="/var/lib/kubelet/pods/59e9f1e6-cc9a-45a0-9a03-e3b1526b5783/volumes" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.449724 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.450122 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.667950 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.697183 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:47 crc kubenswrapper[4883]: E0310 09:07:47.697523 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerName="controller-manager" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.697545 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerName="controller-manager" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.697669 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" containerName="controller-manager" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.698147 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.700794 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.767889 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca\") pod \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.767945 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert\") pod \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.767987 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config\") pod \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.768098 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbwc2\" (UniqueName: \"kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2\") pod \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\" (UID: \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.768150 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles\") pod \"625af2a2-6c46-49c9-90bd-0730adfcf9a8\" (UID: 
\"625af2a2-6c46-49c9-90bd-0730adfcf9a8\") " Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.769546 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "625af2a2-6c46-49c9-90bd-0730adfcf9a8" (UID: "625af2a2-6c46-49c9-90bd-0730adfcf9a8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.769574 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca" (OuterVolumeSpecName: "client-ca") pod "625af2a2-6c46-49c9-90bd-0730adfcf9a8" (UID: "625af2a2-6c46-49c9-90bd-0730adfcf9a8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.770509 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config" (OuterVolumeSpecName: "config") pod "625af2a2-6c46-49c9-90bd-0730adfcf9a8" (UID: "625af2a2-6c46-49c9-90bd-0730adfcf9a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.776102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2" (OuterVolumeSpecName: "kube-api-access-zbwc2") pod "625af2a2-6c46-49c9-90bd-0730adfcf9a8" (UID: "625af2a2-6c46-49c9-90bd-0730adfcf9a8"). InnerVolumeSpecName "kube-api-access-zbwc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.776254 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "625af2a2-6c46-49c9-90bd-0730adfcf9a8" (UID: "625af2a2-6c46-49c9-90bd-0730adfcf9a8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.876433 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877034 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fct8x\" (UniqueName: \"kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877090 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877127 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877194 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877250 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877262 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/625af2a2-6c46-49c9-90bd-0730adfcf9a8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877272 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877285 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbwc2\" (UniqueName: \"kubernetes.io/projected/625af2a2-6c46-49c9-90bd-0730adfcf9a8-kube-api-access-zbwc2\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.877294 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/625af2a2-6c46-49c9-90bd-0730adfcf9a8-proxy-ca-bundles\") on node \"crc\" DevicePath 
\"\"" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.919865 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerStarted","Data":"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6"} Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.924922 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerStarted","Data":"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c"} Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.926735 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" event={"ID":"625af2a2-6c46-49c9-90bd-0730adfcf9a8","Type":"ContainerDied","Data":"39ca8bd84b869eeb8c74d1ff18bd08cab3d284caa5f72a4f7529690086692a85"} Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.926775 4883 scope.go:117] "RemoveContainer" containerID="7bce35fe57df60a11c326c0ddd8ec6eab1f979bc0ca526968c4c7e1e91df682e" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.926896 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b79978d66-7m8kr" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.978536 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fct8x\" (UniqueName: \"kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.978598 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.978631 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.978680 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.978717 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.979988 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.981955 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.982665 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.985933 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.989792 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b79978d66-7m8kr"] Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.996079 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:47 crc kubenswrapper[4883]: I0310 09:07:47.999579 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fct8x\" (UniqueName: \"kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x\") pod \"controller-manager-546cd7f689-ltpnp\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.030761 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.097177 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625af2a2-6c46-49c9-90bd-0730adfcf9a8" path="/var/lib/kubelet/pods/625af2a2-6c46-49c9-90bd-0730adfcf9a8/volumes" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.109618 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.290790 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:48 crc kubenswrapper[4883]: W0310 09:07:48.423516 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod219a69fb_a146_4034_b934_3f1f8f81b338.slice/crio-7e450ce69bf6209d6d7b73d0ae6cf77e018fd9c7d10014b6d95fe66587dd6667 WatchSource:0}: Error finding container 7e450ce69bf6209d6d7b73d0ae6cf77e018fd9c7d10014b6d95fe66587dd6667: Status 404 returned error can't find the container 
with id 7e450ce69bf6209d6d7b73d0ae6cf77e018fd9c7d10014b6d95fe66587dd6667 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.942707 4883 generic.go:334] "Generic (PLEG): container finished" podID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerID="4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.942874 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerDied","Data":"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.947761 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" event={"ID":"48a61604-9f49-4b8f-8534-707af35c4667","Type":"ContainerStarted","Data":"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.947808 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" event={"ID":"48a61604-9f49-4b8f-8534-707af35c4667","Type":"ContainerStarted","Data":"545697c18b61099ee4c8abb7b405fe27097c321bcbe2376ab05f34b5b5edc3c0"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.949058 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.954797 4883 generic.go:334] "Generic (PLEG): container finished" podID="504544f8-69a4-4562-87d5-fa61335ea052" containerID="f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.954861 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" 
event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerDied","Data":"f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.958010 4883 generic.go:334] "Generic (PLEG): container finished" podID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerID="4b273f81ff4709aae90e59a96f5a8a9a4b8f566fb0b34b2f69e7956ff48dc97f" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.958062 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerDied","Data":"4b273f81ff4709aae90e59a96f5a8a9a4b8f566fb0b34b2f69e7956ff48dc97f"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.960536 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.963633 4883 generic.go:334] "Generic (PLEG): container finished" podID="816c3b00-c481-4c08-9691-0244d3c044e3" containerID="a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.963695 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerDied","Data":"a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.969275 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" event={"ID":"219a69fb-a146-4034-b934-3f1f8f81b338","Type":"ContainerStarted","Data":"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.969303 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" event={"ID":"219a69fb-a146-4034-b934-3f1f8f81b338","Type":"ContainerStarted","Data":"7e450ce69bf6209d6d7b73d0ae6cf77e018fd9c7d10014b6d95fe66587dd6667"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.969726 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.972106 4883 generic.go:334] "Generic (PLEG): container finished" podID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerID="168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.972268 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerDied","Data":"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.975236 4883 generic.go:334] "Generic (PLEG): container finished" podID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerID="db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.975291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerDied","Data":"db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.975986 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.981262 4883 generic.go:334] "Generic (PLEG): container finished" podID="7746695d-3e1f-455d-9acc-dffdba42c0d5" 
containerID="055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.981406 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerDied","Data":"055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98"} Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.984416 4883 generic.go:334] "Generic (PLEG): container finished" podID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerID="e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050" exitCode=0 Mar 10 09:07:48 crc kubenswrapper[4883]: I0310 09:07:48.984459 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerDied","Data":"e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050"} Mar 10 09:07:49 crc kubenswrapper[4883]: I0310 09:07:49.009915 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" podStartSLOduration=11.009885949 podStartE2EDuration="11.009885949s" podCreationTimestamp="2026-03-10 09:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:49.007993466 +0000 UTC m=+255.262891355" watchObservedRunningTime="2026-03-10 09:07:49.009885949 +0000 UTC m=+255.264783858" Mar 10 09:07:49 crc kubenswrapper[4883]: I0310 09:07:49.048639 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" podStartSLOduration=11.048626512 podStartE2EDuration="11.048626512s" podCreationTimestamp="2026-03-10 09:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:07:49.048589362 +0000 UTC m=+255.303487250" watchObservedRunningTime="2026-03-10 09:07:49.048626512 +0000 UTC m=+255.303524401" Mar 10 09:07:49 crc kubenswrapper[4883]: I0310 09:07:49.993746 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerStarted","Data":"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40"} Mar 10 09:07:49 crc kubenswrapper[4883]: I0310 09:07:49.997615 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerStarted","Data":"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125"} Mar 10 09:07:49 crc kubenswrapper[4883]: I0310 09:07:49.999884 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerStarted","Data":"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.002498 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerStarted","Data":"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.004751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerStarted","Data":"a4dabd119299b8b7187e538e04ec43348adce3c73e526c74a61e41615d67afab"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.006692 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" 
event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerStarted","Data":"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.008493 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerStarted","Data":"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.011108 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerStarted","Data":"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff"} Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.020265 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vhnvt" podStartSLOduration=5.161998165 podStartE2EDuration="27.02025427s" podCreationTimestamp="2026-03-10 09:07:23 +0000 UTC" firstStartedPulling="2026-03-10 09:07:27.682707786 +0000 UTC m=+233.937605675" lastFinishedPulling="2026-03-10 09:07:49.54096389 +0000 UTC m=+255.795861780" observedRunningTime="2026-03-10 09:07:50.017622846 +0000 UTC m=+256.272520735" watchObservedRunningTime="2026-03-10 09:07:50.02025427 +0000 UTC m=+256.275152158" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.056239 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ltgv7" podStartSLOduration=2.093811966 podStartE2EDuration="30.056227554s" podCreationTimestamp="2026-03-10 09:07:20 +0000 UTC" firstStartedPulling="2026-03-10 09:07:21.599209394 +0000 UTC m=+227.854107284" lastFinishedPulling="2026-03-10 09:07:49.561624982 +0000 UTC m=+255.816522872" observedRunningTime="2026-03-10 09:07:50.055107436 +0000 UTC m=+256.310005324" 
watchObservedRunningTime="2026-03-10 09:07:50.056227554 +0000 UTC m=+256.311125443" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.056753 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p22dp" podStartSLOduration=6.241165868 podStartE2EDuration="28.056748866s" podCreationTimestamp="2026-03-10 09:07:22 +0000 UTC" firstStartedPulling="2026-03-10 09:07:27.666673374 +0000 UTC m=+233.921571273" lastFinishedPulling="2026-03-10 09:07:49.482256382 +0000 UTC m=+255.737154271" observedRunningTime="2026-03-10 09:07:50.039823336 +0000 UTC m=+256.294721225" watchObservedRunningTime="2026-03-10 09:07:50.056748866 +0000 UTC m=+256.311646755" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.075309 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dtqw6" podStartSLOduration=2.121823455 podStartE2EDuration="30.075293994s" podCreationTimestamp="2026-03-10 09:07:20 +0000 UTC" firstStartedPulling="2026-03-10 09:07:21.599645405 +0000 UTC m=+227.854543294" lastFinishedPulling="2026-03-10 09:07:49.553115945 +0000 UTC m=+255.808013833" observedRunningTime="2026-03-10 09:07:50.071378504 +0000 UTC m=+256.326276392" watchObservedRunningTime="2026-03-10 09:07:50.075293994 +0000 UTC m=+256.330191883" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.088930 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cwsth" podStartSLOduration=2.102009377 podStartE2EDuration="30.088916787s" podCreationTimestamp="2026-03-10 09:07:20 +0000 UTC" firstStartedPulling="2026-03-10 09:07:21.59911154 +0000 UTC m=+227.854009429" lastFinishedPulling="2026-03-10 09:07:49.58601895 +0000 UTC m=+255.840916839" observedRunningTime="2026-03-10 09:07:50.085219097 +0000 UTC m=+256.340116986" watchObservedRunningTime="2026-03-10 09:07:50.088916787 +0000 UTC m=+256.343814667" Mar 10 09:07:50 crc 
kubenswrapper[4883]: I0310 09:07:50.102879 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tlhr4" podStartSLOduration=2.227598451 podStartE2EDuration="30.102859903s" podCreationTimestamp="2026-03-10 09:07:20 +0000 UTC" firstStartedPulling="2026-03-10 09:07:21.599576486 +0000 UTC m=+227.854474375" lastFinishedPulling="2026-03-10 09:07:49.474837937 +0000 UTC m=+255.729735827" observedRunningTime="2026-03-10 09:07:50.100542612 +0000 UTC m=+256.355440501" watchObservedRunningTime="2026-03-10 09:07:50.102859903 +0000 UTC m=+256.357757792" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.133940 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j5rwl" podStartSLOduration=5.249434762 podStartE2EDuration="27.133926061s" podCreationTimestamp="2026-03-10 09:07:23 +0000 UTC" firstStartedPulling="2026-03-10 09:07:27.693413758 +0000 UTC m=+233.948311647" lastFinishedPulling="2026-03-10 09:07:49.577905057 +0000 UTC m=+255.832802946" observedRunningTime="2026-03-10 09:07:50.117624736 +0000 UTC m=+256.372522626" watchObservedRunningTime="2026-03-10 09:07:50.133926061 +0000 UTC m=+256.388823951" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.466460 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.466536 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.581487 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.581541 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ltgv7" 
Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.802373 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.802729 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.989462 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:50 crc kubenswrapper[4883]: I0310 09:07:50.989809 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:07:51 crc kubenswrapper[4883]: I0310 09:07:51.552714 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-tlhr4" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:51 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:51 crc kubenswrapper[4883]: > Mar 10 09:07:51 crc kubenswrapper[4883]: I0310 09:07:51.612605 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-ltgv7" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:51 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:51 crc kubenswrapper[4883]: > Mar 10 09:07:51 crc kubenswrapper[4883]: I0310 09:07:51.832062 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-cwsth" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:51 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:51 crc 
kubenswrapper[4883]: > Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.020210 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dtqw6" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:52 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:52 crc kubenswrapper[4883]: > Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.596029 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.596192 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.635318 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.652637 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2h5dv" podStartSLOduration=11.992897582 podStartE2EDuration="30.652621734s" podCreationTimestamp="2026-03-10 09:07:22 +0000 UTC" firstStartedPulling="2026-03-10 09:07:30.83190895 +0000 UTC m=+237.086806839" lastFinishedPulling="2026-03-10 09:07:49.491633102 +0000 UTC m=+255.746530991" observedRunningTime="2026-03-10 09:07:50.13617275 +0000 UTC m=+256.391070639" watchObservedRunningTime="2026-03-10 09:07:52.652621734 +0000 UTC m=+258.907519623" Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.984861 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:52 crc kubenswrapper[4883]: I0310 09:07:52.984993 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:53 crc kubenswrapper[4883]: I0310 09:07:53.017355 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:53 crc kubenswrapper[4883]: I0310 09:07:53.592621 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:53 crc kubenswrapper[4883]: I0310 09:07:53.592677 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:07:53 crc kubenswrapper[4883]: I0310 09:07:53.990229 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:53 crc kubenswrapper[4883]: I0310 09:07:53.990286 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:07:54 crc kubenswrapper[4883]: I0310 09:07:54.068412 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:07:54 crc kubenswrapper[4883]: I0310 09:07:54.068533 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:54 crc kubenswrapper[4883]: I0310 09:07:54.470905 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6zbd" Mar 10 09:07:54 crc kubenswrapper[4883]: I0310 09:07:54.623549 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vhnvt" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:54 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:54 crc 
kubenswrapper[4883]: > Mar 10 09:07:55 crc kubenswrapper[4883]: I0310 09:07:55.021746 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j5rwl" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="registry-server" probeResult="failure" output=< Mar 10 09:07:55 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:07:55 crc kubenswrapper[4883]: > Mar 10 09:07:56 crc kubenswrapper[4883]: I0310 09:07:56.549001 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:56 crc kubenswrapper[4883]: I0310 09:07:56.549553 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p22dp" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="registry-server" containerID="cri-o://38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff" gracePeriod=2 Mar 10 09:07:56 crc kubenswrapper[4883]: I0310 09:07:56.943754 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.012585 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities\") pod \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.013365 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities" (OuterVolumeSpecName: "utilities") pod "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" (UID: "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.049013 4883 generic.go:334] "Generic (PLEG): container finished" podID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerID="38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff" exitCode=0 Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.049040 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p22dp" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.049052 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerDied","Data":"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff"} Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.049084 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p22dp" event={"ID":"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c","Type":"ContainerDied","Data":"04c01d96f1c8b1e9164cb1cc2f2e5455c16dd9c69f66112c9edab2362be5bcac"} Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.049114 4883 scope.go:117] "RemoveContainer" containerID="38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.066378 4883 scope.go:117] "RemoveContainer" containerID="168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.078714 4883 scope.go:117] "RemoveContainer" containerID="b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.097598 4883 scope.go:117] "RemoveContainer" containerID="38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff" Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.097986 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff\": container with ID starting with 38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff not found: ID does not exist" containerID="38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.098035 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff"} err="failed to get container status \"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff\": rpc error: code = NotFound desc = could not find container \"38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff\": container with ID starting with 38041fae57fef9d5e30aaf19b7600ba92adb0463406e261af4e3b02cba8189ff not found: ID does not exist" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.098064 4883 scope.go:117] "RemoveContainer" containerID="168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c" Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.098452 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c\": container with ID starting with 168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c not found: ID does not exist" containerID="168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.098500 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c"} err="failed to get container status \"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c\": rpc error: code = NotFound desc = could not find container 
\"168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c\": container with ID starting with 168b10f3b2652a0f25805c55151e8fdd125e4aafb97e4e2611caa8a474aa936c not found: ID does not exist" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.098517 4883 scope.go:117] "RemoveContainer" containerID="b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001" Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.098805 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001\": container with ID starting with b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001 not found: ID does not exist" containerID="b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.098826 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001"} err="failed to get container status \"b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001\": rpc error: code = NotFound desc = could not find container \"b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001\": container with ID starting with b9894543d80049eff60d4b0d92a37ad61ce45a4d9df9288a39c03ec33947b001 not found: ID does not exist" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.113440 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqv7r\" (UniqueName: \"kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r\") pod \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.113530 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content\") pod \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\" (UID: \"9cbe9069-9970-4e7d-a2ec-d563c6b46a1c\") " Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.113914 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.120849 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r" (OuterVolumeSpecName: "kube-api-access-lqv7r") pod "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" (UID: "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c"). InnerVolumeSpecName "kube-api-access-lqv7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.140061 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" (UID: "9cbe9069-9970-4e7d-a2ec-d563c6b46a1c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.214892 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqv7r\" (UniqueName: \"kubernetes.io/projected/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-kube-api-access-lqv7r\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.215196 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.295292 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.296008 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="registry-server" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.296031 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="registry-server" Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.296057 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="extract-content" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.296064 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="extract-content" Mar 10 09:07:57 crc kubenswrapper[4883]: E0310 09:07:57.296086 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="extract-utilities" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.296096 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="extract-utilities" Mar 10 09:07:57 crc 
kubenswrapper[4883]: I0310 09:07:57.296359 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" containerName="registry-server" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.297304 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.301172 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.301494 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.319008 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.321536 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.321587 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.375535 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.376349 4883 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p22dp"] Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.423971 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.424026 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.424129 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.443120 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:57 crc kubenswrapper[4883]: I0310 09:07:57.630576 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.003289 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.058308 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"68a11b02-0067-46ed-84fe-c764e88b2810","Type":"ContainerStarted","Data":"617b5bb81242e03395057fe95d456d123903cf5a9eb30f13bfcebeef10f8dca9"} Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.086594 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cbe9069-9970-4e7d-a2ec-d563c6b46a1c" path="/var/lib/kubelet/pods/9cbe9069-9970-4e7d-a2ec-d563c6b46a1c/volumes" Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.492641 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.493112 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" podUID="219a69fb-a146-4034-b934-3f1f8f81b338" containerName="controller-manager" containerID="cri-o://a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6" gracePeriod=30 Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.520461 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.520748 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" podUID="48a61604-9f49-4b8f-8534-707af35c4667" containerName="route-controller-manager" 
containerID="cri-o://a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011" gracePeriod=30 Mar 10 09:07:58 crc kubenswrapper[4883]: I0310 09:07:58.953194 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.011466 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.044168 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config\") pod \"48a61604-9f49-4b8f-8534-707af35c4667\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.044950 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config" (OuterVolumeSpecName: "config") pod "48a61604-9f49-4b8f-8534-707af35c4667" (UID: "48a61604-9f49-4b8f-8534-707af35c4667"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.066178 4883 generic.go:334] "Generic (PLEG): container finished" podID="219a69fb-a146-4034-b934-3f1f8f81b338" containerID="a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6" exitCode=0 Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.066246 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" event={"ID":"219a69fb-a146-4034-b934-3f1f8f81b338","Type":"ContainerDied","Data":"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6"} Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.066280 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" event={"ID":"219a69fb-a146-4034-b934-3f1f8f81b338","Type":"ContainerDied","Data":"7e450ce69bf6209d6d7b73d0ae6cf77e018fd9c7d10014b6d95fe66587dd6667"} Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.066298 4883 scope.go:117] "RemoveContainer" containerID="a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.066421 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-546cd7f689-ltpnp" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.070720 4883 generic.go:334] "Generic (PLEG): container finished" podID="68a11b02-0067-46ed-84fe-c764e88b2810" containerID="5ea3a1acee75ae8a56c0a060d638503d84669497c3c2aebe6a60cd9076f24524" exitCode=0 Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.070799 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"68a11b02-0067-46ed-84fe-c764e88b2810","Type":"ContainerDied","Data":"5ea3a1acee75ae8a56c0a060d638503d84669497c3c2aebe6a60cd9076f24524"} Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.072232 4883 generic.go:334] "Generic (PLEG): container finished" podID="48a61604-9f49-4b8f-8534-707af35c4667" containerID="a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011" exitCode=0 Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.072263 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" event={"ID":"48a61604-9f49-4b8f-8534-707af35c4667","Type":"ContainerDied","Data":"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011"} Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.072454 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" event={"ID":"48a61604-9f49-4b8f-8534-707af35c4667","Type":"ContainerDied","Data":"545697c18b61099ee4c8abb7b405fe27097c321bcbe2376ab05f34b5b5edc3c0"} Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.072268 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.083009 4883 scope.go:117] "RemoveContainer" containerID="a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6" Mar 10 09:07:59 crc kubenswrapper[4883]: E0310 09:07:59.083305 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6\": container with ID starting with a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6 not found: ID does not exist" containerID="a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.083341 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6"} err="failed to get container status \"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6\": rpc error: code = NotFound desc = could not find container \"a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6\": container with ID starting with a506ee877d7143210676483b4591c480af31e8134b5f6cc784632d72ec26f8f6 not found: ID does not exist" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.083363 4883 scope.go:117] "RemoveContainer" containerID="a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.101767 4883 scope.go:117] "RemoveContainer" containerID="a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011" Mar 10 09:07:59 crc kubenswrapper[4883]: E0310 09:07:59.102055 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011\": container with ID starting with 
a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011 not found: ID does not exist" containerID="a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.102087 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011"} err="failed to get container status \"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011\": rpc error: code = NotFound desc = could not find container \"a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011\": container with ID starting with a2840a493a204099ae81d196d769738264b8a57ffe4a390b74040eef999c2011 not found: ID does not exist" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.145527 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca\") pod \"219a69fb-a146-4034-b934-3f1f8f81b338\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.146504 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca" (OuterVolumeSpecName: "client-ca") pod "219a69fb-a146-4034-b934-3f1f8f81b338" (UID: "219a69fb-a146-4034-b934-3f1f8f81b338"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.146582 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles\") pod \"219a69fb-a146-4034-b934-3f1f8f81b338\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.146606 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config\") pod \"219a69fb-a146-4034-b934-3f1f8f81b338\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147080 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "219a69fb-a146-4034-b934-3f1f8f81b338" (UID: "219a69fb-a146-4034-b934-3f1f8f81b338"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147114 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca\") pod \"48a61604-9f49-4b8f-8534-707af35c4667\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147152 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca" (OuterVolumeSpecName: "client-ca") pod "48a61604-9f49-4b8f-8534-707af35c4667" (UID: "48a61604-9f49-4b8f-8534-707af35c4667"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147168 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert\") pod \"48a61604-9f49-4b8f-8534-707af35c4667\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147228 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fct8x\" (UniqueName: \"kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x\") pod \"219a69fb-a146-4034-b934-3f1f8f81b338\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147254 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqsg2\" (UniqueName: \"kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2\") pod \"48a61604-9f49-4b8f-8534-707af35c4667\" (UID: \"48a61604-9f49-4b8f-8534-707af35c4667\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147289 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert\") pod \"219a69fb-a146-4034-b934-3f1f8f81b338\" (UID: \"219a69fb-a146-4034-b934-3f1f8f81b338\") " Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147284 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config" (OuterVolumeSpecName: "config") pod "219a69fb-a146-4034-b934-3f1f8f81b338" (UID: "219a69fb-a146-4034-b934-3f1f8f81b338"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147753 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147776 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147785 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147796 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a69fb-a146-4034-b934-3f1f8f81b338-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.147806 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a61604-9f49-4b8f-8534-707af35c4667-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.152225 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x" (OuterVolumeSpecName: "kube-api-access-fct8x") pod "219a69fb-a146-4034-b934-3f1f8f81b338" (UID: "219a69fb-a146-4034-b934-3f1f8f81b338"). InnerVolumeSpecName "kube-api-access-fct8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.152336 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2" (OuterVolumeSpecName: "kube-api-access-nqsg2") pod "48a61604-9f49-4b8f-8534-707af35c4667" (UID: "48a61604-9f49-4b8f-8534-707af35c4667"). InnerVolumeSpecName "kube-api-access-nqsg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.152423 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "48a61604-9f49-4b8f-8534-707af35c4667" (UID: "48a61604-9f49-4b8f-8534-707af35c4667"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.171095 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "219a69fb-a146-4034-b934-3f1f8f81b338" (UID: "219a69fb-a146-4034-b934-3f1f8f81b338"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.249451 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a61604-9f49-4b8f-8534-707af35c4667-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.249494 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fct8x\" (UniqueName: \"kubernetes.io/projected/219a69fb-a146-4034-b934-3f1f8f81b338-kube-api-access-fct8x\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.249506 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqsg2\" (UniqueName: \"kubernetes.io/projected/48a61604-9f49-4b8f-8534-707af35c4667-kube-api-access-nqsg2\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.249515 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a69fb-a146-4034-b934-3f1f8f81b338-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.392317 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.396116 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-546cd7f689-ltpnp"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.404630 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.407262 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b9ddc79f6-tq6gx"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.962375 4883 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"] Mar 10 09:07:59 crc kubenswrapper[4883]: E0310 09:07:59.962667 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a61604-9f49-4b8f-8534-707af35c4667" containerName="route-controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.962681 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a61604-9f49-4b8f-8534-707af35c4667" containerName="route-controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: E0310 09:07:59.962710 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219a69fb-a146-4034-b934-3f1f8f81b338" containerName="controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.962716 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="219a69fb-a146-4034-b934-3f1f8f81b338" containerName="controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.962827 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="48a61604-9f49-4b8f-8534-707af35c4667" containerName="route-controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.962844 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="219a69fb-a146-4034-b934-3f1f8f81b338" containerName="controller-manager" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.963317 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.964966 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.965432 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.965661 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.965920 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.966239 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.966834 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.967128 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.967740 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.969792 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.969989 4883 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.970215 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.970815 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.970921 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.971174 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.973104 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.974784 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"] Mar 10 09:07:59 crc kubenswrapper[4883]: I0310 09:07:59.976739 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"] Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066147 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066543 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066779 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066837 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066868 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066895 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gcgw\" (UniqueName: \"kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " 
pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.066946 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.067076 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.067125 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s545\" (UniqueName: \"kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.086263 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219a69fb-a146-4034-b934-3f1f8f81b338" path="/var/lib/kubelet/pods/219a69fb-a146-4034-b934-3f1f8f81b338/volumes" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.086786 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a61604-9f49-4b8f-8534-707af35c4667" path="/var/lib/kubelet/pods/48a61604-9f49-4b8f-8534-707af35c4667/volumes" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.132491 4883 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29552228-kn7mm"] Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.133163 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.135206 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.135355 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.135462 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.139493 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-kn7mm"] Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168134 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168426 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168453 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s545\" (UniqueName: 
\"kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168560 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168582 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168627 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f4xr\" (UniqueName: \"kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr\") pod \"auto-csr-approver-29552228-kn7mm\" (UID: \"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e\") " pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168684 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc 
kubenswrapper[4883]: I0310 09:08:00.168712 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168728 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.168755 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gcgw\" (UniqueName: \"kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.170287 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.170297 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: 
\"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.170299 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.170352 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.170340 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.172659 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.173226 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert\") pod 
\"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.182601 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s545\" (UniqueName: \"kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545\") pod \"route-controller-manager-6b5b795987-9m4tj\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.184102 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gcgw\" (UniqueName: \"kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw\") pod \"controller-manager-8dd4584bf-swxjl\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.270026 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f4xr\" (UniqueName: \"kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr\") pod \"auto-csr-approver-29552228-kn7mm\" (UID: \"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e\") " pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.280042 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.287468 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.289206 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f4xr\" (UniqueName: \"kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr\") pod \"auto-csr-approver-29552228-kn7mm\" (UID: \"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e\") " pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.312840 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.370951 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir\") pod \"68a11b02-0067-46ed-84fe-c764e88b2810\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.371031 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access\") pod \"68a11b02-0067-46ed-84fe-c764e88b2810\" (UID: \"68a11b02-0067-46ed-84fe-c764e88b2810\") " Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.371072 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "68a11b02-0067-46ed-84fe-c764e88b2810" (UID: "68a11b02-0067-46ed-84fe-c764e88b2810"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.371315 4883 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a11b02-0067-46ed-84fe-c764e88b2810-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.375426 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "68a11b02-0067-46ed-84fe-c764e88b2810" (UID: "68a11b02-0067-46ed-84fe-c764e88b2810"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.443949 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.472015 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68a11b02-0067-46ed-84fe-c764e88b2810-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.497555 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.534590 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.598633 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-kn7mm"] Mar 10 09:08:00 crc kubenswrapper[4883]: W0310 09:08:00.603368 4883 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc07643a3_c0a9_4770_a08e_ab4fb32dfe8e.slice/crio-f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a WatchSource:0}: Error finding container f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a: Status 404 returned error can't find the container with id f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.620633 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.657331 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"] Mar 10 09:08:00 crc kubenswrapper[4883]: W0310 09:08:00.659361 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f1f59ff_042d_4e9f_a4d9_06a1d99492cc.slice/crio-199d7977a0ba3670d2ef5d01bcafbeb1e1c791e7ffa2db6a44ea7cd8d65163fd WatchSource:0}: Error finding container 199d7977a0ba3670d2ef5d01bcafbeb1e1c791e7ffa2db6a44ea7cd8d65163fd: Status 404 returned error can't find the container with id 199d7977a0ba3670d2ef5d01bcafbeb1e1c791e7ffa2db6a44ea7cd8d65163fd Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.662860 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.700636 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"] Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.836261 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:08:00 crc kubenswrapper[4883]: I0310 09:08:00.875114 4883 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.045178 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.077116 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.088117 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" event={"ID":"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e","Type":"ContainerStarted","Data":"f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.090512 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" event={"ID":"78f401ec-703b-4789-8453-89b7a572a89a","Type":"ContainerStarted","Data":"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.090550 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" event={"ID":"78f401ec-703b-4789-8453-89b7a572a89a","Type":"ContainerStarted","Data":"f055423efedb6df834dff4c56e0e74a59306fd98bac74cffa507280ee1ff3f83"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.090666 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.092247 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"68a11b02-0067-46ed-84fe-c764e88b2810","Type":"ContainerDied","Data":"617b5bb81242e03395057fe95d456d123903cf5a9eb30f13bfcebeef10f8dca9"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.092278 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="617b5bb81242e03395057fe95d456d123903cf5a9eb30f13bfcebeef10f8dca9" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.092330 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.094431 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.095222 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" event={"ID":"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc","Type":"ContainerStarted","Data":"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.095250 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" event={"ID":"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc","Type":"ContainerStarted","Data":"199d7977a0ba3670d2ef5d01bcafbeb1e1c791e7ffa2db6a44ea7cd8d65163fd"} Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.108223 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" podStartSLOduration=3.1082113 podStartE2EDuration="3.1082113s" podCreationTimestamp="2026-03-10 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:01.105947075 +0000 UTC m=+267.360844965" 
watchObservedRunningTime="2026-03-10 09:08:01.1082113 +0000 UTC m=+267.363109189" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.956061 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" podStartSLOduration=3.956041359 podStartE2EDuration="3.956041359s" podCreationTimestamp="2026-03-10 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:01.129093663 +0000 UTC m=+267.383991552" watchObservedRunningTime="2026-03-10 09:08:01.956041359 +0000 UTC m=+268.210939248" Mar 10 09:08:01 crc kubenswrapper[4883]: I0310 09:08:01.956235 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cwsth"] Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.101990 4883 generic.go:334] "Generic (PLEG): container finished" podID="c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" containerID="8d2862eee27c865a5680228f73b67899d38c264111c25020319e7ec39c7a9c80" exitCode=0 Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.102041 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" event={"ID":"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e","Type":"ContainerDied","Data":"8d2862eee27c865a5680228f73b67899d38c264111c25020319e7ec39c7a9c80"} Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.102641 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.102804 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cwsth" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="registry-server" 
containerID="cri-o://3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28" gracePeriod=2 Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.108123 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.427707 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.497742 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content\") pod \"f98b0611-9639-4987-9e9b-0e1c4695a164\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.497804 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz6z4\" (UniqueName: \"kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4\") pod \"f98b0611-9639-4987-9e9b-0e1c4695a164\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.499743 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities\") pod \"f98b0611-9639-4987-9e9b-0e1c4695a164\" (UID: \"f98b0611-9639-4987-9e9b-0e1c4695a164\") " Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.500499 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities" (OuterVolumeSpecName: "utilities") pod "f98b0611-9639-4987-9e9b-0e1c4695a164" (UID: "f98b0611-9639-4987-9e9b-0e1c4695a164"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.501431 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.505635 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4" (OuterVolumeSpecName: "kube-api-access-cz6z4") pod "f98b0611-9639-4987-9e9b-0e1c4695a164" (UID: "f98b0611-9639-4987-9e9b-0e1c4695a164"). InnerVolumeSpecName "kube-api-access-cz6z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.543730 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f98b0611-9639-4987-9e9b-0e1c4695a164" (UID: "f98b0611-9639-4987-9e9b-0e1c4695a164"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.603267 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f98b0611-9639-4987-9e9b-0e1c4695a164-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.603305 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz6z4\" (UniqueName: \"kubernetes.io/projected/f98b0611-9639-4987-9e9b-0e1c4695a164-kube-api-access-cz6z4\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.949831 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"] Mar 10 09:08:02 crc kubenswrapper[4883]: I0310 09:08:02.950345 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dtqw6" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="registry-server" containerID="cri-o://a4dabd119299b8b7187e538e04ec43348adce3c73e526c74a61e41615d67afab" gracePeriod=2 Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.111653 4883 generic.go:334] "Generic (PLEG): container finished" podID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerID="a4dabd119299b8b7187e538e04ec43348adce3c73e526c74a61e41615d67afab" exitCode=0 Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.111730 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerDied","Data":"a4dabd119299b8b7187e538e04ec43348adce3c73e526c74a61e41615d67afab"} Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.114104 4883 generic.go:334] "Generic (PLEG): container finished" podID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerID="3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28" exitCode=0 Mar 10 
09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.114347 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwsth" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.115503 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerDied","Data":"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28"} Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.115568 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwsth" event={"ID":"f98b0611-9639-4987-9e9b-0e1c4695a164","Type":"ContainerDied","Data":"d460ae50500ac6d093c81ce23834fe61bddb693cf03ca9139dc421ae40806219"} Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.115595 4883 scope.go:117] "RemoveContainer" containerID="3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.131732 4883 scope.go:117] "RemoveContainer" containerID="e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.147045 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cwsth"] Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.159216 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cwsth"] Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.168095 4883 scope.go:117] "RemoveContainer" containerID="09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.186732 4883 scope.go:117] "RemoveContainer" containerID="3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.187424 4883 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28\": container with ID starting with 3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28 not found: ID does not exist" containerID="3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.187497 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28"} err="failed to get container status \"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28\": rpc error: code = NotFound desc = could not find container \"3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28\": container with ID starting with 3595092d2c8b901ab00d73da207eda0fc1c2e10de1a79d3355ae0b3c7671cc28 not found: ID does not exist" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.187546 4883 scope.go:117] "RemoveContainer" containerID="e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.187967 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050\": container with ID starting with e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050 not found: ID does not exist" containerID="e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.188017 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050"} err="failed to get container status \"e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050\": rpc error: code = NotFound desc = could not find container 
\"e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050\": container with ID starting with e6746b5902e72bedd228ddb10067ddcf4dd1e8b8001cf80da2b1a1b9d69f4050 not found: ID does not exist" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.188050 4883 scope.go:117] "RemoveContainer" containerID="09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.188659 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd\": container with ID starting with 09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd not found: ID does not exist" containerID="09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.188691 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd"} err="failed to get container status \"09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd\": rpc error: code = NotFound desc = could not find container \"09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd\": container with ID starting with 09940d134834a41e5adac27eed0a15c907b1c15c7a8541c92690c7bb2bd989bd not found: ID does not exist" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.318679 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.367171 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.413600 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities\") pod \"804d1679-c6e1-4594-b067-c41da8ee64ab\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.413670 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f4xr\" (UniqueName: \"kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr\") pod \"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e\" (UID: \"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e\") " Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.413737 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9hhp\" (UniqueName: \"kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp\") pod \"804d1679-c6e1-4594-b067-c41da8ee64ab\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.413779 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content\") pod \"804d1679-c6e1-4594-b067-c41da8ee64ab\" (UID: \"804d1679-c6e1-4594-b067-c41da8ee64ab\") " Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.414365 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities" (OuterVolumeSpecName: "utilities") pod "804d1679-c6e1-4594-b067-c41da8ee64ab" (UID: "804d1679-c6e1-4594-b067-c41da8ee64ab"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.418046 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr" (OuterVolumeSpecName: "kube-api-access-7f4xr") pod "c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" (UID: "c07643a3-c0a9-4770-a08e-ab4fb32dfe8e"). InnerVolumeSpecName "kube-api-access-7f4xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.418099 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp" (OuterVolumeSpecName: "kube-api-access-v9hhp") pod "804d1679-c6e1-4594-b067-c41da8ee64ab" (UID: "804d1679-c6e1-4594-b067-c41da8ee64ab"). InnerVolumeSpecName "kube-api-access-v9hhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.458440 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "804d1679-c6e1-4594-b067-c41da8ee64ab" (UID: "804d1679-c6e1-4594-b067-c41da8ee64ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.515683 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.515718 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f4xr\" (UniqueName: \"kubernetes.io/projected/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e-kube-api-access-7f4xr\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.515732 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9hhp\" (UniqueName: \"kubernetes.io/projected/804d1679-c6e1-4594-b067-c41da8ee64ab-kube-api-access-v9hhp\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.515741 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/804d1679-c6e1-4594-b067-c41da8ee64ab-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.625599 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.660728 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.687989 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688434 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" containerName="oc" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688460 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" containerName="oc" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688515 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="extract-utilities" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688526 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="extract-utilities" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688543 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="extract-content" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688551 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="extract-content" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688560 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="registry-server" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688570 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="registry-server" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688585 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="extract-content" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688596 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="extract-content" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688611 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="extract-utilities" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688618 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="extract-utilities" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688628 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a11b02-0067-46ed-84fe-c764e88b2810" containerName="pruner" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688634 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a11b02-0067-46ed-84fe-c764e88b2810" containerName="pruner" Mar 10 09:08:03 crc kubenswrapper[4883]: E0310 09:08:03.688647 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="registry-server" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688654 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="registry-server" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688818 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" containerName="registry-server" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688839 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a11b02-0067-46ed-84fe-c764e88b2810" containerName="pruner" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688847 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" containerName="registry-server" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.688858 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" containerName="oc" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.689664 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.699777 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.703345 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.704145 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.719281 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.719388 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.719455 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.820916 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.820998 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.821044 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.821219 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.821036 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:03 crc kubenswrapper[4883]: I0310 09:08:03.836803 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access\") pod \"installer-9-crc\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.012133 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.025098 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.063768 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.110428 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f98b0611-9639-4987-9e9b-0e1c4695a164" path="/var/lib/kubelet/pods/f98b0611-9639-4987-9e9b-0e1c4695a164/volumes" Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.122924 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" event={"ID":"c07643a3-c0a9-4770-a08e-ab4fb32dfe8e","Type":"ContainerDied","Data":"f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a"} Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.122936 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-kn7mm" Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.122949 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f01f467f7c3e18c4fbdb02f54b5108e0c63dc90a5497bfdf48f4ce08a8ebe34a" Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.128162 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dtqw6" Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.129549 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dtqw6" event={"ID":"804d1679-c6e1-4594-b067-c41da8ee64ab","Type":"ContainerDied","Data":"7fa6f7eaccd4ad67921504c2ac2d68f6a5036f11199646ded41154976ac1c2dc"} Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.129792 4883 scope.go:117] "RemoveContainer" containerID="a4dabd119299b8b7187e538e04ec43348adce3c73e526c74a61e41615d67afab" Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.145212 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"] Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.148668 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dtqw6"] Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.150258 4883 scope.go:117] "RemoveContainer" containerID="4b273f81ff4709aae90e59a96f5a8a9a4b8f566fb0b34b2f69e7956ff48dc97f" Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.172822 4883 scope.go:117] "RemoveContainer" containerID="fbcddb1697b17646b99e5eb8976196f07f60605511e41f78c1cc131392bba69c" Mar 10 09:08:04 crc kubenswrapper[4883]: I0310 09:08:04.383181 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 09:08:05 crc kubenswrapper[4883]: I0310 09:08:05.136409 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"176e0284-eb7e-40ac-8466-c3fab8836176","Type":"ContainerStarted","Data":"3787b6ad2bf41b839ef8b984df31432624a52240182015408ed618baf0c018de"} Mar 10 09:08:05 crc kubenswrapper[4883]: I0310 09:08:05.136778 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"176e0284-eb7e-40ac-8466-c3fab8836176","Type":"ContainerStarted","Data":"552a1b11d60a599de3abec512c8277695d95d44418766139907a9fdf62fd5f01"} Mar 10 09:08:05 crc kubenswrapper[4883]: I0310 09:08:05.151528 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.151507874 podStartE2EDuration="2.151507874s" podCreationTimestamp="2026-03-10 09:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:05.14844562 +0000 UTC m=+271.403343509" watchObservedRunningTime="2026-03-10 09:08:05.151507874 +0000 UTC m=+271.406405763" Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.088771 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804d1679-c6e1-4594-b067-c41da8ee64ab" path="/var/lib/kubelet/pods/804d1679-c6e1-4594-b067-c41da8ee64ab/volumes" Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.349759 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"] Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.350104 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j5rwl" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="registry-server" containerID="cri-o://a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942" gracePeriod=2 Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.716222 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.758505 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content\") pod \"504544f8-69a4-4562-87d5-fa61335ea052\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.758644 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities\") pod \"504544f8-69a4-4562-87d5-fa61335ea052\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.758870 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfh75\" (UniqueName: \"kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75\") pod \"504544f8-69a4-4562-87d5-fa61335ea052\" (UID: \"504544f8-69a4-4562-87d5-fa61335ea052\") " Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.759258 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities" (OuterVolumeSpecName: "utilities") pod "504544f8-69a4-4562-87d5-fa61335ea052" (UID: "504544f8-69a4-4562-87d5-fa61335ea052"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.759426 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.764014 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75" (OuterVolumeSpecName: "kube-api-access-zfh75") pod "504544f8-69a4-4562-87d5-fa61335ea052" (UID: "504544f8-69a4-4562-87d5-fa61335ea052"). InnerVolumeSpecName "kube-api-access-zfh75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.851129 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "504544f8-69a4-4562-87d5-fa61335ea052" (UID: "504544f8-69a4-4562-87d5-fa61335ea052"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.861298 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfh75\" (UniqueName: \"kubernetes.io/projected/504544f8-69a4-4562-87d5-fa61335ea052-kube-api-access-zfh75\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:06 crc kubenswrapper[4883]: I0310 09:08:06.861534 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504544f8-69a4-4562-87d5-fa61335ea052-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.151655 4883 generic.go:334] "Generic (PLEG): container finished" podID="504544f8-69a4-4562-87d5-fa61335ea052" containerID="a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942" exitCode=0 Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.151705 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerDied","Data":"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942"} Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.151740 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j5rwl" event={"ID":"504544f8-69a4-4562-87d5-fa61335ea052","Type":"ContainerDied","Data":"a0d2e9ecb5144e13f35ee0d106385391edc44733dcd6e7ac3cc5e5bac91f227d"} Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.151740 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j5rwl" Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.151760 4883 scope.go:117] "RemoveContainer" containerID="a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942" Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.167152 4883 scope.go:117] "RemoveContainer" containerID="f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f" Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.174124 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"] Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.184757 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j5rwl"] Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.194849 4883 scope.go:117] "RemoveContainer" containerID="d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11" Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.208170 4883 scope.go:117] "RemoveContainer" containerID="a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942" Mar 10 09:08:07 crc kubenswrapper[4883]: E0310 09:08:07.208615 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942\": container with ID starting with a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942 not found: ID does not exist" containerID="a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942" Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.208667 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942"} err="failed to get container status \"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942\": rpc error: code = NotFound desc = could not find container 
\"a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942\": container with ID starting with a50315650d91e3e4922518a8c3b20e600e70deecc9a0f21c96f6c906656ba942 not found: ID does not exist" Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.208689 4883 scope.go:117] "RemoveContainer" containerID="f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f" Mar 10 09:08:07 crc kubenswrapper[4883]: E0310 09:08:07.209124 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f\": container with ID starting with f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f not found: ID does not exist" containerID="f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f" Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.209163 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f"} err="failed to get container status \"f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f\": rpc error: code = NotFound desc = could not find container \"f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f\": container with ID starting with f6031a9d50806c18ea679916ed766388fef8a399b31a4815f63f4f320da9c89f not found: ID does not exist" Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.209177 4883 scope.go:117] "RemoveContainer" containerID="d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11" Mar 10 09:08:07 crc kubenswrapper[4883]: E0310 09:08:07.209445 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11\": container with ID starting with d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11 not found: ID does not exist" 
containerID="d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11" Mar 10 09:08:07 crc kubenswrapper[4883]: I0310 09:08:07.209508 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11"} err="failed to get container status \"d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11\": rpc error: code = NotFound desc = could not find container \"d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11\": container with ID starting with d789bf70ccdfe1cddb9bcb399f3cdf16eabc125f02c7f494a4e1581bc4ea3d11 not found: ID does not exist" Mar 10 09:08:08 crc kubenswrapper[4883]: I0310 09:08:08.086075 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="504544f8-69a4-4562-87d5-fa61335ea052" path="/var/lib/kubelet/pods/504544f8-69a4-4562-87d5-fa61335ea052/volumes" Mar 10 09:08:12 crc kubenswrapper[4883]: I0310 09:08:12.688278 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"] Mar 10 09:08:17 crc kubenswrapper[4883]: I0310 09:08:17.449524 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:08:17 crc kubenswrapper[4883]: I0310 09:08:17.450046 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:08:17 crc kubenswrapper[4883]: I0310 09:08:17.450116 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:08:17 crc kubenswrapper[4883]: I0310 09:08:17.450979 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:08:17 crc kubenswrapper[4883]: I0310 09:08:17.451053 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8" gracePeriod=600 Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.210251 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8" exitCode=0 Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.210330 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8"} Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.211025 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a"} Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.477054 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"] Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.477303 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" podUID="78f401ec-703b-4789-8453-89b7a572a89a" containerName="controller-manager" containerID="cri-o://2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5" gracePeriod=30 Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.491314 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"] Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.491576 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" podUID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" containerName="route-controller-manager" containerID="cri-o://691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb" gracePeriod=30 Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.991978 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:18 crc kubenswrapper[4883]: I0310 09:08:18.997196 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013225 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert\") pod \"78f401ec-703b-4789-8453-89b7a572a89a\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013315 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles\") pod \"78f401ec-703b-4789-8453-89b7a572a89a\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013349 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s545\" (UniqueName: \"kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545\") pod \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013386 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca\") pod \"78f401ec-703b-4789-8453-89b7a572a89a\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013406 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config\") pod \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013445 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca\") pod \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013540 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config\") pod \"78f401ec-703b-4789-8453-89b7a572a89a\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013576 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert\") pod \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\" (UID: \"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc\") " Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.013621 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gcgw\" (UniqueName: \"kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw\") pod \"78f401ec-703b-4789-8453-89b7a572a89a\" (UID: \"78f401ec-703b-4789-8453-89b7a572a89a\") " Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.014157 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "78f401ec-703b-4789-8453-89b7a572a89a" (UID: "78f401ec-703b-4789-8453-89b7a572a89a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.014239 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" (UID: "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.014285 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca" (OuterVolumeSpecName: "client-ca") pod "78f401ec-703b-4789-8453-89b7a572a89a" (UID: "78f401ec-703b-4789-8453-89b7a572a89a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.014350 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config" (OuterVolumeSpecName: "config") pod "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" (UID: "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.014377 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config" (OuterVolumeSpecName: "config") pod "78f401ec-703b-4789-8453-89b7a572a89a" (UID: "78f401ec-703b-4789-8453-89b7a572a89a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.019823 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "78f401ec-703b-4789-8453-89b7a572a89a" (UID: "78f401ec-703b-4789-8453-89b7a572a89a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.020287 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" (UID: "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.021300 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545" (OuterVolumeSpecName: "kube-api-access-5s545") pod "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" (UID: "6f1f59ff-042d-4e9f-a4d9-06a1d99492cc"). InnerVolumeSpecName "kube-api-access-5s545". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.021491 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw" (OuterVolumeSpecName: "kube-api-access-5gcgw") pod "78f401ec-703b-4789-8453-89b7a572a89a" (UID: "78f401ec-703b-4789-8453-89b7a572a89a"). InnerVolumeSpecName "kube-api-access-5gcgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115807 4883 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78f401ec-703b-4789-8453-89b7a572a89a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115856 4883 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115870 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s545\" (UniqueName: \"kubernetes.io/projected/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-kube-api-access-5s545\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115880 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115896 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115905 4883 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115914 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78f401ec-703b-4789-8453-89b7a572a89a-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115922 4883 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.115932 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gcgw\" (UniqueName: \"kubernetes.io/projected/78f401ec-703b-4789-8453-89b7a572a89a-kube-api-access-5gcgw\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.216318 4883 generic.go:334] "Generic (PLEG): container finished" podID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" containerID="691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb" exitCode=0 Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.216394 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" event={"ID":"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc","Type":"ContainerDied","Data":"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb"} Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.216448 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" event={"ID":"6f1f59ff-042d-4e9f-a4d9-06a1d99492cc","Type":"ContainerDied","Data":"199d7977a0ba3670d2ef5d01bcafbeb1e1c791e7ffa2db6a44ea7cd8d65163fd"} Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.216466 4883 scope.go:117] "RemoveContainer" containerID="691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.216588 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.218608 4883 generic.go:334] "Generic (PLEG): container finished" podID="78f401ec-703b-4789-8453-89b7a572a89a" containerID="2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5" exitCode=0 Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.218655 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" event={"ID":"78f401ec-703b-4789-8453-89b7a572a89a","Type":"ContainerDied","Data":"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5"} Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.218687 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" event={"ID":"78f401ec-703b-4789-8453-89b7a572a89a","Type":"ContainerDied","Data":"f055423efedb6df834dff4c56e0e74a59306fd98bac74cffa507280ee1ff3f83"} Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.218743 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8dd4584bf-swxjl" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.230143 4883 scope.go:117] "RemoveContainer" containerID="691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.230705 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb\": container with ID starting with 691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb not found: ID does not exist" containerID="691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.230742 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb"} err="failed to get container status \"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb\": rpc error: code = NotFound desc = could not find container \"691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb\": container with ID starting with 691c7f6a81b10210708a80d151c09eefc94264984daf090d52e3595b379f1fdb not found: ID does not exist" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.230765 4883 scope.go:117] "RemoveContainer" containerID="2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.240347 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.241878 4883 scope.go:117] "RemoveContainer" containerID="2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.242662 4883 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5\": container with ID starting with 2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5 not found: ID does not exist" containerID="2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.242707 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5"} err="failed to get container status \"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5\": rpc error: code = NotFound desc = could not find container \"2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5\": container with ID starting with 2be277c2761f98bc9200dbc05858688b660bb7d005841510f7221082be81feb5 not found: ID does not exist" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.244521 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5b795987-9m4tj"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.249694 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.252987 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8dd4584bf-swxjl"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.975529 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7d57cc-k595r"] Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.975990 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="extract-content" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976010 4883 
state_mem.go:107] "Deleted CPUSet assignment" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="extract-content" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.976038 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="registry-server" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976044 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="registry-server" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.976054 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="extract-utilities" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976062 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="extract-utilities" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.976076 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" containerName="route-controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976082 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" containerName="route-controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: E0310 09:08:19.976090 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f401ec-703b-4789-8453-89b7a572a89a" containerName="controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976098 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f401ec-703b-4789-8453-89b7a572a89a" containerName="controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976304 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f401ec-703b-4789-8453-89b7a572a89a" containerName="controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 
09:08:19.976315 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="504544f8-69a4-4562-87d5-fa61335ea052" containerName="registry-server" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.976341 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" containerName="route-controller-manager" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.977097 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.979100 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.980239 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.980276 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.980921 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.981062 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.981166 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.981315 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.981595 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.983625 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.983787 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.984230 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.984392 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.985058 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 
09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.985302 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.989884 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.991289 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7d57cc-k595r"] Mar 10 09:08:19 crc kubenswrapper[4883]: I0310 09:08:19.996883 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv"] Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.086983 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f1f59ff-042d-4e9f-a4d9-06a1d99492cc" path="/var/lib/kubelet/pods/6f1f59ff-042d-4e9f-a4d9-06a1d99492cc/volumes" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.087753 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f401ec-703b-4789-8453-89b7a572a89a" path="/var/lib/kubelet/pods/78f401ec-703b-4789-8453-89b7a572a89a/volumes" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131599 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-config\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131641 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-serving-cert\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: 
\"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131678 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsz8r\" (UniqueName: \"kubernetes.io/projected/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-kube-api-access-hsz8r\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131701 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbdf4\" (UniqueName: \"kubernetes.io/projected/16dd6ea4-f278-458c-b9e2-93085190d1b3-kube-api-access-hbdf4\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131732 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dd6ea4-f278-458c-b9e2-93085190d1b3-serving-cert\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131748 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-client-ca\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131770 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-client-ca\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131794 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-proxy-ca-bundles\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.131853 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-config\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233189 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-config\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233320 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-config\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: 
\"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233367 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-serving-cert\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233393 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsz8r\" (UniqueName: \"kubernetes.io/projected/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-kube-api-access-hsz8r\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233428 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbdf4\" (UniqueName: \"kubernetes.io/projected/16dd6ea4-f278-458c-b9e2-93085190d1b3-kube-api-access-hbdf4\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233465 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dd6ea4-f278-458c-b9e2-93085190d1b3-serving-cert\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233511 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-client-ca\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233537 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-client-ca\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.233581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-proxy-ca-bundles\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.234682 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-config\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.234790 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-client-ca\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 
09:08:20.234995 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dd6ea4-f278-458c-b9e2-93085190d1b3-client-ca\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.235250 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-proxy-ca-bundles\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.235422 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-config\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.240813 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dd6ea4-f278-458c-b9e2-93085190d1b3-serving-cert\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.242604 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-serving-cert\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " 
pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.248350 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbdf4\" (UniqueName: \"kubernetes.io/projected/16dd6ea4-f278-458c-b9e2-93085190d1b3-kube-api-access-hbdf4\") pod \"route-controller-manager-568ddf66f-9vpqv\" (UID: \"16dd6ea4-f278-458c-b9e2-93085190d1b3\") " pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.249877 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsz8r\" (UniqueName: \"kubernetes.io/projected/d8dbef90-2f02-4ca8-9c38-cbb026c82e5b-kube-api-access-hsz8r\") pod \"controller-manager-7b6c7d57cc-k595r\" (UID: \"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b\") " pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.302402 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.310829 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.688312 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv"] Mar 10 09:08:20 crc kubenswrapper[4883]: W0310 09:08:20.696096 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16dd6ea4_f278_458c_b9e2_93085190d1b3.slice/crio-8dedfd1b93d833fac5844ad449c87c891ec39ca513f553f0b4d046e63e09495c WatchSource:0}: Error finding container 8dedfd1b93d833fac5844ad449c87c891ec39ca513f553f0b4d046e63e09495c: Status 404 returned error can't find the container with id 8dedfd1b93d833fac5844ad449c87c891ec39ca513f553f0b4d046e63e09495c Mar 10 09:08:20 crc kubenswrapper[4883]: I0310 09:08:20.736191 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7d57cc-k595r"] Mar 10 09:08:20 crc kubenswrapper[4883]: W0310 09:08:20.738361 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8dbef90_2f02_4ca8_9c38_cbb026c82e5b.slice/crio-a18a4be07e0b7004fbad321acf29391a6a83963183f58c90b51d99a5d576973c WatchSource:0}: Error finding container a18a4be07e0b7004fbad321acf29391a6a83963183f58c90b51d99a5d576973c: Status 404 returned error can't find the container with id a18a4be07e0b7004fbad321acf29391a6a83963183f58c90b51d99a5d576973c Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.234716 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" event={"ID":"16dd6ea4-f278-458c-b9e2-93085190d1b3","Type":"ContainerStarted","Data":"5c961f531b0dc68502c29b99ffd643a7d1ee02388ec051eb965974afd188944f"} Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.234765 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" event={"ID":"16dd6ea4-f278-458c-b9e2-93085190d1b3","Type":"ContainerStarted","Data":"8dedfd1b93d833fac5844ad449c87c891ec39ca513f553f0b4d046e63e09495c"} Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.234935 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.236822 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" event={"ID":"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b","Type":"ContainerStarted","Data":"1533e92453d6691b6336f7c80535c9c0fc5afda3dc6cb698c4a2e15dd630f784"} Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.236854 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" event={"ID":"d8dbef90-2f02-4ca8-9c38-cbb026c82e5b","Type":"ContainerStarted","Data":"a18a4be07e0b7004fbad321acf29391a6a83963183f58c90b51d99a5d576973c"} Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.237026 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.240263 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.251189 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" podStartSLOduration=3.251177794 podStartE2EDuration="3.251177794s" podCreationTimestamp="2026-03-10 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:21.248774298 +0000 UTC m=+287.503672188" watchObservedRunningTime="2026-03-10 09:08:21.251177794 +0000 UTC m=+287.506075682" Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.251645 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-568ddf66f-9vpqv" Mar 10 09:08:21 crc kubenswrapper[4883]: I0310 09:08:21.266552 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b6c7d57cc-k595r" podStartSLOduration=3.266539435 podStartE2EDuration="3.266539435s" podCreationTimestamp="2026-03-10 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:21.265012816 +0000 UTC m=+287.519910706" watchObservedRunningTime="2026-03-10 09:08:21.266539435 +0000 UTC m=+287.521437325" Mar 10 09:08:37 crc kubenswrapper[4883]: I0310 09:08:37.710055 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" podUID="90f9a6f1-0760-4398-80cf-70c615c7032d" containerName="oauth-openshift" containerID="cri-o://9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03" gracePeriod=15 Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.129238 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.156669 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66f68474cb-cmj6s"] Mar 10 09:08:38 crc kubenswrapper[4883]: E0310 09:08:38.156886 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f9a6f1-0760-4398-80cf-70c615c7032d" containerName="oauth-openshift" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.156906 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f9a6f1-0760-4398-80cf-70c615c7032d" containerName="oauth-openshift" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.156995 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f9a6f1-0760-4398-80cf-70c615c7032d" containerName="oauth-openshift" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.157353 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.167843 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66f68474cb-cmj6s"] Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.240300 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.240349 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" 
(UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.240386 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.240441 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241154 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.240522 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241199 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241227 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241253 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241272 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241289 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 
crc kubenswrapper[4883]: I0310 09:08:38.241305 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241330 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241376 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lm8b\" (UniqueName: \"kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241395 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login\") pod \"90f9a6f1-0760-4398-80cf-70c615c7032d\" (UID: \"90f9a6f1-0760-4398-80cf-70c615c7032d\") " Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241505 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " 
pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241542 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-dir\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241599 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-login\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241630 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241680 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241702 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-session\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241788 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241808 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241833 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241862 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241879 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-policies\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241909 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241939 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-error\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241958 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdtvx\" (UniqueName: \"kubernetes.io/projected/27d82da9-1a67-4bb0-9a5b-21e2642140bf-kube-api-access-kdtvx\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " 
pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241963 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.241996 4883 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.242073 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.242115 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.242151 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.246461 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.246910 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.246954 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b" (OuterVolumeSpecName: "kube-api-access-9lm8b") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "kube-api-access-9lm8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.247206 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.247549 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.247639 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.247817 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.247955 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.248050 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "90f9a6f1-0760-4398-80cf-70c615c7032d" (UID: "90f9a6f1-0760-4398-80cf-70c615c7032d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.334180 4883 generic.go:334] "Generic (PLEG): container finished" podID="90f9a6f1-0760-4398-80cf-70c615c7032d" containerID="9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03" exitCode=0 Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.334263 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" event={"ID":"90f9a6f1-0760-4398-80cf-70c615c7032d","Type":"ContainerDied","Data":"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03"} Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.334279 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.334320 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m7986" event={"ID":"90f9a6f1-0760-4398-80cf-70c615c7032d","Type":"ContainerDied","Data":"4fcf975f26107b7cfd1ff1be2d34f1e281e19924c7820362af5907d5ba2ac3dc"} Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.334349 4883 scope.go:117] "RemoveContainer" containerID="9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.342711 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.342869 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.342968 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: 
I0310 09:08:38.343045 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343118 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-policies\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343200 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343274 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-error\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343348 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdtvx\" (UniqueName: \"kubernetes.io/projected/27d82da9-1a67-4bb0-9a5b-21e2642140bf-kube-api-access-kdtvx\") pod 
\"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343432 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343535 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-dir\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343647 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-login\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343732 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343808 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343894 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-session\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344008 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344077 4883 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344133 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344178 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-policies\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: 
\"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343735 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.343744 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27d82da9-1a67-4bb0-9a5b-21e2642140bf-audit-dir\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344196 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344394 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344454 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344554 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344614 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344693 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344756 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lm8b\" (UniqueName: \"kubernetes.io/projected/90f9a6f1-0760-4398-80cf-70c615c7032d-kube-api-access-9lm8b\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344801 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344812 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344915 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.344943 4883 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90f9a6f1-0760-4398-80cf-70c615c7032d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.346366 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.346752 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-error\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.346884 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.346903 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.346895 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.347309 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.347968 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.348119 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-user-template-login\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " 
pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.351058 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27d82da9-1a67-4bb0-9a5b-21e2642140bf-v4-0-config-system-session\") pod \"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.356375 4883 scope.go:117] "RemoveContainer" containerID="9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03" Mar 10 09:08:38 crc kubenswrapper[4883]: E0310 09:08:38.356710 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03\": container with ID starting with 9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03 not found: ID does not exist" containerID="9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.356743 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03"} err="failed to get container status \"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03\": rpc error: code = NotFound desc = could not find container \"9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03\": container with ID starting with 9dbfdedc537ae51778d1f84c7fb3f68fcd62cc9002fc352ce12930960801fd03 not found: ID does not exist" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.360450 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdtvx\" (UniqueName: \"kubernetes.io/projected/27d82da9-1a67-4bb0-9a5b-21e2642140bf-kube-api-access-kdtvx\") pod 
\"oauth-openshift-66f68474cb-cmj6s\" (UID: \"27d82da9-1a67-4bb0-9a5b-21e2642140bf\") " pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.364889 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"] Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.368402 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m7986"] Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.472357 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:38 crc kubenswrapper[4883]: I0310 09:08:38.858074 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66f68474cb-cmj6s"] Mar 10 09:08:38 crc kubenswrapper[4883]: W0310 09:08:38.866086 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27d82da9_1a67_4bb0_9a5b_21e2642140bf.slice/crio-b5e5fe23b1e46f4d08b8b6205a028f87b98cb2dc3adfc1858280ee29a164cef4 WatchSource:0}: Error finding container b5e5fe23b1e46f4d08b8b6205a028f87b98cb2dc3adfc1858280ee29a164cef4: Status 404 returned error can't find the container with id b5e5fe23b1e46f4d08b8b6205a028f87b98cb2dc3adfc1858280ee29a164cef4 Mar 10 09:08:39 crc kubenswrapper[4883]: I0310 09:08:39.345086 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" event={"ID":"27d82da9-1a67-4bb0-9a5b-21e2642140bf","Type":"ContainerStarted","Data":"d78b6114971fd5b7b5f2587d1a0058b5af2181eb8c8460aacfe68b636d07ce7a"} Mar 10 09:08:39 crc kubenswrapper[4883]: I0310 09:08:39.345418 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:39 crc 
kubenswrapper[4883]: I0310 09:08:39.345433 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" event={"ID":"27d82da9-1a67-4bb0-9a5b-21e2642140bf","Type":"ContainerStarted","Data":"b5e5fe23b1e46f4d08b8b6205a028f87b98cb2dc3adfc1858280ee29a164cef4"} Mar 10 09:08:39 crc kubenswrapper[4883]: I0310 09:08:39.350011 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" Mar 10 09:08:39 crc kubenswrapper[4883]: I0310 09:08:39.376354 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66f68474cb-cmj6s" podStartSLOduration=27.376325152 podStartE2EDuration="27.376325152s" podCreationTimestamp="2026-03-10 09:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:08:39.374036895 +0000 UTC m=+305.628934784" watchObservedRunningTime="2026-03-10 09:08:39.376325152 +0000 UTC m=+305.631223042" Mar 10 09:08:40 crc kubenswrapper[4883]: I0310 09:08:40.086658 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90f9a6f1-0760-4398-80cf-70c615c7032d" path="/var/lib/kubelet/pods/90f9a6f1-0760-4398-80cf-70c615c7032d/volumes" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.377488 4883 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378292 4883 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378427 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378625 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3" gracePeriod=15 Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378685 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef" gracePeriod=15 Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378730 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada" gracePeriod=15 Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378688 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0" gracePeriod=15 Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.378748 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea" gracePeriod=15 Mar 10 09:08:42 crc 
kubenswrapper[4883]: I0310 09:08:42.379847 4883 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380279 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380323 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380334 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380341 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380750 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380771 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380783 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380789 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380797 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 
09:08:42.380803 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380819 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380825 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380832 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380838 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.380846 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.380852 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381032 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381043 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381053 4883 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381059 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381066 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381074 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381083 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381091 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.381191 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381199 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: E0310 09:08:42.381208 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.381214 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 
crc kubenswrapper[4883]: I0310 09:08:42.381347 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397034 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397074 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397100 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397127 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397196 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397324 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397398 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.397464 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504301 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504445 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504565 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504660 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504508 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504626 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504384 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504791 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504918 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.504990 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505056 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505092 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505284 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505381 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:42 crc kubenswrapper[4883]: I0310 09:08:42.505399 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.245316 4883 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 
09:08:43.246119 4883 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.246795 4883 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.247100 4883 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.247455 4883 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.247510 4883 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.247734 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="200ms" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.369466 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.370922 4883 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.371727 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea" exitCode=0 Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.371754 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0" exitCode=0 Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.371763 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef" exitCode=0 Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.371771 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada" exitCode=2 Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.371843 4883 scope.go:117] "RemoveContainer" containerID="c97a0bb60671b8a1e8c4bcfb4a44f9cfbe07b966bb3b0374a4dcb3202510725a" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.373132 4883 generic.go:334] "Generic (PLEG): container finished" podID="176e0284-eb7e-40ac-8466-c3fab8836176" containerID="3787b6ad2bf41b839ef8b984df31432624a52240182015408ed618baf0c018de" exitCode=0 Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.373300 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"176e0284-eb7e-40ac-8466-c3fab8836176","Type":"ContainerDied","Data":"3787b6ad2bf41b839ef8b984df31432624a52240182015408ed618baf0c018de"} Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.374011 4883 status_manager.go:851] 
"Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: I0310 09:08:43.374322 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.448405 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="400ms" Mar 10 09:08:43 crc kubenswrapper[4883]: E0310 09:08:43.849263 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="800ms" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.082349 4883 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.082665 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.383108 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:08:44 crc kubenswrapper[4883]: E0310 09:08:44.650341 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="1.6s" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.710229 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.710898 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.711342 4883 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.711530 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.711530 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.711938 4883 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.712113 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727168 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock\") pod \"176e0284-eb7e-40ac-8466-c3fab8836176\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727220 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir\") pod \"176e0284-eb7e-40ac-8466-c3fab8836176\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727236 4883 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock" (OuterVolumeSpecName: "var-lock") pod "176e0284-eb7e-40ac-8466-c3fab8836176" (UID: "176e0284-eb7e-40ac-8466-c3fab8836176"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727253 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727284 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727306 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "176e0284-eb7e-40ac-8466-c3fab8836176" (UID: "176e0284-eb7e-40ac-8466-c3fab8836176"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727306 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727346 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727360 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727383 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727380 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access\") pod \"176e0284-eb7e-40ac-8466-c3fab8836176\" (UID: \"176e0284-eb7e-40ac-8466-c3fab8836176\") " Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727618 4883 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727637 4883 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727645 4883 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727654 4883 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.727663 4883 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/176e0284-eb7e-40ac-8466-c3fab8836176-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.732136 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "176e0284-eb7e-40ac-8466-c3fab8836176" 
(UID: "176e0284-eb7e-40ac-8466-c3fab8836176"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:44 crc kubenswrapper[4883]: I0310 09:08:44.828713 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/176e0284-eb7e-40ac-8466-c3fab8836176-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.390785 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.390781 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"176e0284-eb7e-40ac-8466-c3fab8836176","Type":"ContainerDied","Data":"552a1b11d60a599de3abec512c8277695d95d44418766139907a9fdf62fd5f01"} Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.390903 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552a1b11d60a599de3abec512c8277695d95d44418766139907a9fdf62fd5f01" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.394162 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.394990 4883 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3" exitCode=0 Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.395058 4883 scope.go:117] "RemoveContainer" containerID="65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.395071 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.404000 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.404312 4883 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.408736 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.409130 4883 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.412299 4883 scope.go:117] "RemoveContainer" containerID="1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.425159 4883 scope.go:117] "RemoveContainer" containerID="27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef" Mar 10 09:08:45 crc 
kubenswrapper[4883]: I0310 09:08:45.438169 4883 scope.go:117] "RemoveContainer" containerID="32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.454967 4883 scope.go:117] "RemoveContainer" containerID="6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.475111 4883 scope.go:117] "RemoveContainer" containerID="0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.494586 4883 scope.go:117] "RemoveContainer" containerID="65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.495030 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\": container with ID starting with 65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea not found: ID does not exist" containerID="65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.495130 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea"} err="failed to get container status \"65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\": rpc error: code = NotFound desc = could not find container \"65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea\": container with ID starting with 65470d6922cce4b8539321d721eab7788826b1e2944433c49739d82ba07581ea not found: ID does not exist" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.495223 4883 scope.go:117] "RemoveContainer" containerID="1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.495598 
4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\": container with ID starting with 1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0 not found: ID does not exist" containerID="1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.495649 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0"} err="failed to get container status \"1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\": rpc error: code = NotFound desc = could not find container \"1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0\": container with ID starting with 1eeb62f7eeecb42b8663ce6dcb045a86e139bf68568593a9da60fa26787757a0 not found: ID does not exist" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.495687 4883 scope.go:117] "RemoveContainer" containerID="27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.496084 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\": container with ID starting with 27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef not found: ID does not exist" containerID="27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.496151 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef"} err="failed to get container status \"27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\": rpc error: code = 
NotFound desc = could not find container \"27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef\": container with ID starting with 27e28e935d6266eca51e44c47a789be56437d686d316e48b22fef90f0555aeef not found: ID does not exist" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.496181 4883 scope.go:117] "RemoveContainer" containerID="32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.496439 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\": container with ID starting with 32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada not found: ID does not exist" containerID="32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.496552 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada"} err="failed to get container status \"32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\": rpc error: code = NotFound desc = could not find container \"32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada\": container with ID starting with 32fdfe342dd6333f159b750c3e142ad4eafb78951a6a86c8b69c6801eccefada not found: ID does not exist" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.496647 4883 scope.go:117] "RemoveContainer" containerID="6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.496979 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\": container with ID starting with 
6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3 not found: ID does not exist" containerID="6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.497017 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3"} err="failed to get container status \"6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\": rpc error: code = NotFound desc = could not find container \"6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3\": container with ID starting with 6f498bd5f5a49c5b45c2b263ca2ae2ac463778407253b03598ee48d3ad8c03d3 not found: ID does not exist" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.497036 4883 scope.go:117] "RemoveContainer" containerID="0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766" Mar 10 09:08:45 crc kubenswrapper[4883]: E0310 09:08:45.497315 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\": container with ID starting with 0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766 not found: ID does not exist" containerID="0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766" Mar 10 09:08:45 crc kubenswrapper[4883]: I0310 09:08:45.497398 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766"} err="failed to get container status \"0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\": rpc error: code = NotFound desc = could not find container \"0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766\": container with ID starting with 0e320f5933d372887ccd0708b5a6f8e771e6c14939d45a9647bc24210e171766 not found: ID does not 
exist" Mar 10 09:08:46 crc kubenswrapper[4883]: I0310 09:08:46.086781 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 10 09:08:46 crc kubenswrapper[4883]: E0310 09:08:46.251251 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="3.2s" Mar 10 09:08:47 crc kubenswrapper[4883]: E0310 09:08:47.405992 4883 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.26.140:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:47 crc kubenswrapper[4883]: I0310 09:08:47.406385 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:47 crc kubenswrapper[4883]: E0310 09:08:47.426437 4883 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.140:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b6fbb0a7e0002 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:08:47.42601933 +0000 UTC m=+313.680917219,LastTimestamp:2026-03-10 09:08:47.42601933 +0000 UTC m=+313.680917219,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:08:48 crc kubenswrapper[4883]: I0310 09:08:48.414656 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97"} Mar 10 09:08:48 crc kubenswrapper[4883]: I0310 09:08:48.415329 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ff954ebccd801344dbf231359e366102e3322c626868177138aa0408ffe61662"} Mar 10 09:08:48 crc 
kubenswrapper[4883]: I0310 09:08:48.415910 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:48 crc kubenswrapper[4883]: E0310 09:08:48.415935 4883 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.26.140:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:08:49 crc kubenswrapper[4883]: E0310 09:08:49.172642 4883 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 192.168.26.140:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" volumeName="registry-storage" Mar 10 09:08:49 crc kubenswrapper[4883]: E0310 09:08:49.452030 4883 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.140:6443: connect: connection refused" interval="6.4s" Mar 10 09:08:50 crc kubenswrapper[4883]: E0310 09:08:50.286798 4883 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.26.140:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b6fbb0a7e0002 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 09:08:47.42601933 +0000 UTC m=+313.680917219,LastTimestamp:2026-03-10 09:08:47.42601933 +0000 UTC m=+313.680917219,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.079110 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.083029 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.083724 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.097091 4883 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 
09:08:54.097121 4883 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:54 crc kubenswrapper[4883]: E0310 09:08:54.097518 4883 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.098156 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:54 crc kubenswrapper[4883]: W0310 09:08:54.118663 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-389531043a26ad96c805ceda0574d811641b4cb78750239adfc35de16612cc07 WatchSource:0}: Error finding container 389531043a26ad96c805ceda0574d811641b4cb78750239adfc35de16612cc07: Status 404 returned error can't find the container with id 389531043a26ad96c805ceda0574d811641b4cb78750239adfc35de16612cc07 Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.447208 4883 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="8029248d3ffac147bea5f21b2ed5a173034cb51098695d4ac91961b6196bb600" exitCode=0 Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.447283 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"8029248d3ffac147bea5f21b2ed5a173034cb51098695d4ac91961b6196bb600"} Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.447324 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"389531043a26ad96c805ceda0574d811641b4cb78750239adfc35de16612cc07"} Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.447646 4883 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.447670 4883 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:54 crc kubenswrapper[4883]: E0310 09:08:54.447989 4883 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:54 crc kubenswrapper[4883]: I0310 09:08:54.448182 4883 status_manager.go:851] "Failed to get status for pod" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.140:6443: connect: connection refused" Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.455621 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1096c1d9a2e294bd0f61f15e072645e1504385c7def66cf789514a63b6c06f52"} Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.456503 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4adc5f80e7505865a41c60abde887ffad91bc1bbfd89c4ae7c63b44399d82f8d"} Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.456594 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6dadc55f0b768da84020856b89d9fb67477618930f565bc9642fbf5df6feef7c"} Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.456689 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0394f59209da83456f47f287d8b4f4a78fba4b3726b0e9b2ae219187bcb0f9e0"} Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.457212 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9b4abf49a6e034f53ba327df4a6bd7a7b41695dff1f9e643121ab43c2a7ef748"} Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.457327 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.456912 4883 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:55 crc kubenswrapper[4883]: I0310 09:08:55.457469 4883 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:08:56 crc kubenswrapper[4883]: I0310 09:08:56.464527 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 09:08:56 crc kubenswrapper[4883]: I0310 09:08:56.464910 4883 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d" exitCode=1 Mar 10 09:08:56 
crc kubenswrapper[4883]: I0310 09:08:56.464966 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d"} Mar 10 09:08:56 crc kubenswrapper[4883]: I0310 09:08:56.465513 4883 scope.go:117] "RemoveContainer" containerID="bdb719f85a0acb53d5952a65acccbbc8f7223bc45f14306e275c4abba97d0a3d" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.081666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.081811 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.083758 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.084191 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.093078 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.099551 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.183544 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.183617 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.183648 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:08:57 crc kubenswrapper[4883]: 
I0310 09:08:57.185375 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.185415 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.195791 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.197377 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd6597a3-f861-4126-933e-d6134c8bd4b5-metrics-certs\") pod \"network-metrics-daemon-gmq5n\" (UID: \"bd6597a3-f861-4126-933e-d6134c8bd4b5\") " pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.207674 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.209518 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.293491 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.300891 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.306188 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.316122 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.323219 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gmq5n" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.336806 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.473497 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 09:08:57 crc kubenswrapper[4883]: I0310 09:08:57.473823 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5a1732939c8e095f503ee44134e4761109af876024b9ab37c7ad425d9188de80"} Mar 10 09:08:57 crc kubenswrapper[4883]: W0310 09:08:57.696036 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-93592c8394a193177fa3ac76ceadaba7a7de8fdca29bbf09ca46c3378064b233 
WatchSource:0}: Error finding container 93592c8394a193177fa3ac76ceadaba7a7de8fdca29bbf09ca46c3378064b233: Status 404 returned error can't find the container with id 93592c8394a193177fa3ac76ceadaba7a7de8fdca29bbf09ca46c3378064b233 Mar 10 09:08:57 crc kubenswrapper[4883]: W0310 09:08:57.760453 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-f171ec1b2ea41f28f78790c4aec7dd7ceb311d00891ba24174962b8ccedae6ef WatchSource:0}: Error finding container f171ec1b2ea41f28f78790c4aec7dd7ceb311d00891ba24174962b8ccedae6ef: Status 404 returned error can't find the container with id f171ec1b2ea41f28f78790c4aec7dd7ceb311d00891ba24174962b8ccedae6ef Mar 10 09:08:57 crc kubenswrapper[4883]: W0310 09:08:57.761326 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd6597a3_f861_4126_933e_d6134c8bd4b5.slice/crio-88207ec306f38ab39b5886f82a94d8b243de610e5bdb6dfbbdd0ada1283e2c6a WatchSource:0}: Error finding container 88207ec306f38ab39b5886f82a94d8b243de610e5bdb6dfbbdd0ada1283e2c6a: Status 404 returned error can't find the container with id 88207ec306f38ab39b5886f82a94d8b243de610e5bdb6dfbbdd0ada1283e2c6a Mar 10 09:08:57 crc kubenswrapper[4883]: W0310 09:08:57.818557 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-e8d6bf354911c182324ecc57e5227fb11ece20603ab857a5d5be1df95b79cc79 WatchSource:0}: Error finding container e8d6bf354911c182324ecc57e5227fb11ece20603ab857a5d5be1df95b79cc79: Status 404 returned error can't find the container with id e8d6bf354911c182324ecc57e5227fb11ece20603ab857a5d5be1df95b79cc79 Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.482447 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9339e786ed636c2c5d6bb74a149fe15a84fa7912d4e3ee24ed502f1c95e5fc2b"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.482813 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f171ec1b2ea41f28f78790c4aec7dd7ceb311d00891ba24174962b8ccedae6ef"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.484264 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" event={"ID":"bd6597a3-f861-4126-933e-d6134c8bd4b5","Type":"ContainerStarted","Data":"fb5e6257382d0578ecf8afbbee721052318816ef87140f1c0a174043a46565cb"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.484291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" event={"ID":"bd6597a3-f861-4126-933e-d6134c8bd4b5","Type":"ContainerStarted","Data":"126f51dfe48fb824b24ab2e48acfd441c2e681f0f96f9335a45ea7e2a76d444d"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.484303 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gmq5n" event={"ID":"bd6597a3-f861-4126-933e-d6134c8bd4b5","Type":"ContainerStarted","Data":"88207ec306f38ab39b5886f82a94d8b243de610e5bdb6dfbbdd0ada1283e2c6a"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.485749 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8886c140227ff327084d724ef890173f33c85a37c593f364844b5ab4ac1d19a5"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.485776 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"93592c8394a193177fa3ac76ceadaba7a7de8fdca29bbf09ca46c3378064b233"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.487904 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c472bcd9d9c0472f328d6c1f36187b89d0b8d20951fdcd1717c02bd75540191d"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.487932 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e8d6bf354911c182324ecc57e5227fb11ece20603ab857a5d5be1df95b79cc79"} Mar 10 09:08:58 crc kubenswrapper[4883]: I0310 09:08:58.488232 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.098327 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.098375 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.103486 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.493646 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.493708 4883 generic.go:334] "Generic 
(PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="9339e786ed636c2c5d6bb74a149fe15a84fa7912d4e3ee24ed502f1c95e5fc2b" exitCode=255 Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.493804 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"9339e786ed636c2c5d6bb74a149fe15a84fa7912d4e3ee24ed502f1c95e5fc2b"} Mar 10 09:08:59 crc kubenswrapper[4883]: I0310 09:08:59.494415 4883 scope.go:117] "RemoveContainer" containerID="9339e786ed636c2c5d6bb74a149fe15a84fa7912d4e3ee24ed502f1c95e5fc2b" Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.505209 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.506110 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.506170 4883 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="6fa32de7977b89f96045125a3065be9fd9aab19a84c1cfa8b0dbee07faa2179a" exitCode=255 Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.506214 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"6fa32de7977b89f96045125a3065be9fd9aab19a84c1cfa8b0dbee07faa2179a"} Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.506270 4883 scope.go:117] "RemoveContainer" containerID="9339e786ed636c2c5d6bb74a149fe15a84fa7912d4e3ee24ed502f1c95e5fc2b" Mar 10 09:09:00 crc 
kubenswrapper[4883]: I0310 09:09:00.507093 4883 scope.go:117] "RemoveContainer" containerID="6fa32de7977b89f96045125a3065be9fd9aab19a84c1cfa8b0dbee07faa2179a" Mar 10 09:09:00 crc kubenswrapper[4883]: E0310 09:09:00.507502 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:09:00 crc kubenswrapper[4883]: I0310 09:09:00.748619 4883 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:09:01 crc kubenswrapper[4883]: I0310 09:09:01.515033 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 10 09:09:01 crc kubenswrapper[4883]: I0310 09:09:01.515491 4883 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:09:01 crc kubenswrapper[4883]: I0310 09:09:01.515513 4883 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:09:01 crc kubenswrapper[4883]: I0310 09:09:01.519060 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:09:02 crc kubenswrapper[4883]: I0310 09:09:02.520727 4883 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:09:02 crc kubenswrapper[4883]: I0310 09:09:02.521334 4883 mirror_client.go:130] "Deleting a mirror 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="16dbc720-9dd7-4b18-a9da-02204c723f2c" Mar 10 09:09:04 crc kubenswrapper[4883]: I0310 09:09:04.106309 4883 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ddcb348f-b8d8-4643-9dae-15e77c150ea7" Mar 10 09:09:05 crc kubenswrapper[4883]: I0310 09:09:05.188661 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:09:05 crc kubenswrapper[4883]: I0310 09:09:05.197839 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:09:05 crc kubenswrapper[4883]: I0310 09:09:05.537174 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:09:06 crc kubenswrapper[4883]: I0310 09:09:06.546766 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 09:09:08 crc kubenswrapper[4883]: I0310 09:09:08.398763 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 09:09:08 crc kubenswrapper[4883]: I0310 09:09:08.649328 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 09:09:09 crc kubenswrapper[4883]: I0310 09:09:09.370364 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 09:09:09 crc kubenswrapper[4883]: I0310 09:09:09.732573 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 09:09:09 
crc kubenswrapper[4883]: I0310 09:09:09.908221 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 09:09:11 crc kubenswrapper[4883]: I0310 09:09:11.631096 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 09:09:11 crc kubenswrapper[4883]: I0310 09:09:11.981692 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.157627 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.284591 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.367209 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.407388 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.471219 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.720775 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 09:09:12 crc kubenswrapper[4883]: I0310 09:09:12.812366 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.298899 4883 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.319921 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.328568 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.664791 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.943798 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 09:09:13 crc kubenswrapper[4883]: I0310 09:09:13.962027 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.086917 4883 scope.go:117] "RemoveContainer" containerID="6fa32de7977b89f96045125a3065be9fd9aab19a84c1cfa8b0dbee07faa2179a" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.212669 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.285695 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.333149 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.584351 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 
10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.584687 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3"} Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.760807 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 09:09:14 crc kubenswrapper[4883]: I0310 09:09:14.791242 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.127235 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.180689 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.266714 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.302947 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.428709 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.465877 4883 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.595020 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.595566 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.595618 4883 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3" exitCode=255 Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.595655 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3"} Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.595718 4883 scope.go:117] "RemoveContainer" containerID="6fa32de7977b89f96045125a3065be9fd9aab19a84c1cfa8b0dbee07faa2179a" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.596298 4883 scope.go:117] "RemoveContainer" containerID="b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3" Mar 10 09:09:15 crc kubenswrapper[4883]: E0310 09:09:15.597264 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.608156 4883 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.676843 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.731231 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.843590 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.902694 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 09:09:15 crc kubenswrapper[4883]: I0310 09:09:15.936495 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.066897 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.107588 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.173674 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.254560 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.419261 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.420398 4883 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.455425 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.470441 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.507896 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.512072 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.520981 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.541391 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.583158 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.603917 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.620697 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.666645 4883 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.936387 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 09:09:16 crc kubenswrapper[4883]: I0310 09:09:16.938738 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.171705 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.182118 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.252761 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.276877 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.336048 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.373060 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.395191 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.518091 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 09:09:17 crc kubenswrapper[4883]: 
I0310 09:09:17.535904 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.636559 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.756244 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.802504 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.860934 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.861939 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 09:09:17 crc kubenswrapper[4883]: I0310 09:09:17.896378 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.005628 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.160187 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.210252 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.257210 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 09:09:18 crc 
kubenswrapper[4883]: I0310 09:09:18.260345 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.277201 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.319067 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.365728 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.483357 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.608807 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.711229 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.735307 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.758235 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.789954 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 09:09:18 crc 
kubenswrapper[4883]: I0310 09:09:18.792612 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.812093 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.827856 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.863092 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 09:09:18 crc kubenswrapper[4883]: I0310 09:09:18.870824 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.003514 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.196261 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.363555 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.423410 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.483333 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.509327 4883 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.520843 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.552152 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.559633 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.648697 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.662006 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.667843 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.670378 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.685815 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.691938 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.706942 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" 
Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.736364 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 09:09:19 crc kubenswrapper[4883]: I0310 09:09:19.836608 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.020026 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.020584 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.021341 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.066025 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.120552 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.134253 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.140923 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.240532 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.247930 4883 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"image-registry-certificates" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.269336 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.390416 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.441953 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.450581 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.467467 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.476040 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.585980 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.600243 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.671236 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.717262 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.753340 4883 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.908927 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.913466 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 09:09:20 crc kubenswrapper[4883]: I0310 09:09:20.993789 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.039605 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.063038 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.064624 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.180138 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.258919 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.282764 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.363409 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 09:09:21 crc kubenswrapper[4883]: 
I0310 09:09:21.553155 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.588801 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.602575 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.641525 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.649831 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.667194 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.667214 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.739535 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.752328 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.759663 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.770314 4883 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.782883 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.921760 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 09:09:21 crc kubenswrapper[4883]: I0310 09:09:21.922374 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.040196 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.055525 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.077044 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.115523 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.119057 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.170102 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.211237 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" 
Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.392215 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.472252 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.475140 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.507304 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.545543 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.674687 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.810816 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.850256 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.890909 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.895738 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 09:09:22 crc kubenswrapper[4883]: I0310 09:09:22.897101 4883 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.012983 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.074234 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.257693 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.354018 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.607226 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.608898 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.681751 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.682636 4883 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.751073 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.820632 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.880096 4883 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.895578 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 09:09:23 crc kubenswrapper[4883]: I0310 09:09:23.952524 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.099998 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.277629 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.363359 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.378048 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.538929 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.611783 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.777258 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.850277 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 09:09:24 crc kubenswrapper[4883]: 
I0310 09:09:24.909896 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.924085 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 09:09:24 crc kubenswrapper[4883]: I0310 09:09:24.965612 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.031033 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.036369 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.195731 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.196537 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.208683 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.252181 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.279288 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.378070 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" 
Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.443204 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.535523 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.651690 4883 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.836966 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 09:09:25 crc kubenswrapper[4883]: I0310 09:09:25.988189 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.045892 4883 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.080878 4883 scope.go:117] "RemoveContainer" containerID="b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3" Mar 10 09:09:26 crc kubenswrapper[4883]: E0310 09:09:26.081182 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.103415 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.116886 4883 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.133273 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.171786 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.206990 4883 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.209946 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gmq5n" podStartSLOduration=324.209924245 podStartE2EDuration="5m24.209924245s" podCreationTimestamp="2026-03-10 09:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:09:00.812254309 +0000 UTC m=+327.067152198" watchObservedRunningTime="2026-03-10 09:09:26.209924245 +0000 UTC m=+352.464822133" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.213016 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.213082 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.213104 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gmq5n"] Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.284004 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=26.283982875 podStartE2EDuration="26.283982875s" 
podCreationTimestamp="2026-03-10 09:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:09:26.235994613 +0000 UTC m=+352.490892502" watchObservedRunningTime="2026-03-10 09:09:26.283982875 +0000 UTC m=+352.538880765" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.292115 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.405981 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.428302 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.455533 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.467963 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.703010 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.866689 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.872758 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.899305 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 
09:09:26 crc kubenswrapper[4883]: I0310 09:09:26.928534 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.153218 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.252497 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.310951 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.322506 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.357737 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.510373 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.528086 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.548673 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.569023 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.695597 4883 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"console-config" Mar 10 09:09:27 crc kubenswrapper[4883]: I0310 09:09:27.802808 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.014002 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.018211 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.142899 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.262831 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.592515 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.882859 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 09:09:28 crc kubenswrapper[4883]: I0310 09:09:28.906162 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 09:09:29 crc kubenswrapper[4883]: I0310 09:09:29.134116 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 09:09:29 crc kubenswrapper[4883]: I0310 09:09:29.205626 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 09:09:29 crc kubenswrapper[4883]: I0310 09:09:29.627608 4883 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 09:09:29 crc kubenswrapper[4883]: I0310 09:09:29.779266 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 09:09:30 crc kubenswrapper[4883]: I0310 09:09:30.135637 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 09:09:30 crc kubenswrapper[4883]: I0310 09:09:30.626625 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 09:09:30 crc kubenswrapper[4883]: I0310 09:09:30.863249 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 09:09:31 crc kubenswrapper[4883]: I0310 09:09:31.144993 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 09:09:34 crc kubenswrapper[4883]: I0310 09:09:34.103130 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 09:09:41 crc kubenswrapper[4883]: I0310 09:09:41.080708 4883 scope.go:117] "RemoveContainer" containerID="b86cdaae54af4f3e1175a1bee3cfee793677557148c998765199876e4808e6c3" Mar 10 09:09:41 crc kubenswrapper[4883]: I0310 09:09:41.773533 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 10 09:09:41 crc kubenswrapper[4883]: I0310 09:09:41.774049 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"29f3b75738c92fd6855829ecb4d34df14b21666f45767a69c52a52332ffbfc0a"} Mar 10 09:09:44 crc kubenswrapper[4883]: I0310 09:09:44.823388 4883 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 09:09:44 crc kubenswrapper[4883]: I0310 09:09:44.824093 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97" gracePeriod=5 Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.378083 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.378432 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.438946 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439000 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439017 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439059 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439086 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439088 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439160 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439194 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439221 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439555 4883 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439570 4883 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439579 4883 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.439586 4883 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.445992 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.540680 4883 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.825023 4883 generic.go:334] "Generic (PLEG): container finished" podID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerID="44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907" exitCode=0 Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.825126 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerDied","Data":"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907"} Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.825750 4883 scope.go:117] "RemoveContainer" containerID="44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.827223 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.827383 4883 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97" exitCode=137 Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.827438 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.827514 4883 scope.go:117] "RemoveContainer" containerID="2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.844710 4883 scope.go:117] "RemoveContainer" containerID="2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97" Mar 10 09:09:50 crc kubenswrapper[4883]: E0310 09:09:50.845032 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97\": container with ID starting with 2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97 not found: ID does not exist" containerID="2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97" Mar 10 09:09:50 crc kubenswrapper[4883]: I0310 09:09:50.845061 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97"} err="failed to get container status \"2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97\": rpc error: code = NotFound desc = could not find container \"2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97\": container with ID starting with 2215a4ad4ca7961da0e30bc0a9712d3e6c47d54c7720db191f0fd5ad50935c97 not found: ID does not exist" Mar 10 09:09:51 crc kubenswrapper[4883]: I0310 09:09:51.838806 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerStarted","Data":"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4"} Mar 10 09:09:51 crc kubenswrapper[4883]: I0310 09:09:51.839414 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:09:51 crc kubenswrapper[4883]: I0310 09:09:51.840422 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:09:52 crc kubenswrapper[4883]: I0310 09:09:52.086386 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.156347 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552230-7n2zm"] Mar 10 09:10:00 crc kubenswrapper[4883]: E0310 09:10:00.157235 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" containerName="installer" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.157254 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" containerName="installer" Mar 10 09:10:00 crc kubenswrapper[4883]: E0310 09:10:00.157276 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.157282 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.157435 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.157461 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="176e0284-eb7e-40ac-8466-c3fab8836176" containerName="installer" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.158118 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.161351 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.161799 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.162035 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.163612 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-7n2zm"] Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.249352 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wqvk\" (UniqueName: \"kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk\") pod \"auto-csr-approver-29552230-7n2zm\" (UID: \"9e54057e-1436-403c-bd92-66dbb888b129\") " pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.350916 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wqvk\" (UniqueName: \"kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk\") pod \"auto-csr-approver-29552230-7n2zm\" (UID: \"9e54057e-1436-403c-bd92-66dbb888b129\") " pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.369766 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wqvk\" (UniqueName: \"kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk\") pod \"auto-csr-approver-29552230-7n2zm\" (UID: \"9e54057e-1436-403c-bd92-66dbb888b129\") " 
pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.475403 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:00 crc kubenswrapper[4883]: I0310 09:10:00.905355 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-7n2zm"] Mar 10 09:10:01 crc kubenswrapper[4883]: I0310 09:10:01.896799 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" event={"ID":"9e54057e-1436-403c-bd92-66dbb888b129","Type":"ContainerStarted","Data":"767f188be1f37ce6cc0246fa99cfced7d4e1bfc135c1ee0ba6a4061b69e0bdc9"} Mar 10 09:10:02 crc kubenswrapper[4883]: I0310 09:10:02.904430 4883 generic.go:334] "Generic (PLEG): container finished" podID="9e54057e-1436-403c-bd92-66dbb888b129" containerID="7590615cba141d2df532a5f2b91dc13b678e9424c198e88d350b709d2d0d8639" exitCode=0 Mar 10 09:10:02 crc kubenswrapper[4883]: I0310 09:10:02.904490 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" event={"ID":"9e54057e-1436-403c-bd92-66dbb888b129","Type":"ContainerDied","Data":"7590615cba141d2df532a5f2b91dc13b678e9424c198e88d350b709d2d0d8639"} Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.129619 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.307308 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wqvk\" (UniqueName: \"kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk\") pod \"9e54057e-1436-403c-bd92-66dbb888b129\" (UID: \"9e54057e-1436-403c-bd92-66dbb888b129\") " Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.314091 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk" (OuterVolumeSpecName: "kube-api-access-2wqvk") pod "9e54057e-1436-403c-bd92-66dbb888b129" (UID: "9e54057e-1436-403c-bd92-66dbb888b129"). InnerVolumeSpecName "kube-api-access-2wqvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.409584 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wqvk\" (UniqueName: \"kubernetes.io/projected/9e54057e-1436-403c-bd92-66dbb888b129-kube-api-access-2wqvk\") on node \"crc\" DevicePath \"\"" Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.920704 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" event={"ID":"9e54057e-1436-403c-bd92-66dbb888b129","Type":"ContainerDied","Data":"767f188be1f37ce6cc0246fa99cfced7d4e1bfc135c1ee0ba6a4061b69e0bdc9"} Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.920775 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="767f188be1f37ce6cc0246fa99cfced7d4e1bfc135c1ee0ba6a4061b69e0bdc9" Mar 10 09:10:04 crc kubenswrapper[4883]: I0310 09:10:04.920803 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-7n2zm" Mar 10 09:10:17 crc kubenswrapper[4883]: I0310 09:10:17.449175 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:10:17 crc kubenswrapper[4883]: I0310 09:10:17.449567 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.300961 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x4jrh"] Mar 10 09:10:46 crc kubenswrapper[4883]: E0310 09:10:46.302158 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e54057e-1436-403c-bd92-66dbb888b129" containerName="oc" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.302181 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e54057e-1436-403c-bd92-66dbb888b129" containerName="oc" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.302343 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e54057e-1436-403c-bd92-66dbb888b129" containerName="oc" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.302994 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.316714 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x4jrh"] Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.499929 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw4j6\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-kube-api-access-hw4j6\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500028 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-bound-sa-token\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500089 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500117 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500162 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-trusted-ca\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500318 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-certificates\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500382 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.500527 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-tls\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.520826 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602354 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-bound-sa-token\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602417 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602459 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-trusted-ca\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602504 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-certificates\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602525 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602565 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-tls\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602636 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw4j6\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-kube-api-access-hw4j6\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.602945 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.603809 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-trusted-ca\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc 
kubenswrapper[4883]: I0310 09:10:46.603938 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-certificates\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.609707 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-registry-tls\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.609904 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.618504 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-bound-sa-token\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.618699 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw4j6\" (UniqueName: \"kubernetes.io/projected/57ad81e1-b9ef-405b-bb22-5d496f5d56c6-kube-api-access-hw4j6\") pod \"image-registry-66df7c8f76-x4jrh\" (UID: \"57ad81e1-b9ef-405b-bb22-5d496f5d56c6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.621528 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:46 crc kubenswrapper[4883]: I0310 09:10:46.792468 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-x4jrh"] Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.148288 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" event={"ID":"57ad81e1-b9ef-405b-bb22-5d496f5d56c6","Type":"ContainerStarted","Data":"793cb221e2dd05df8446d3fb0ae59d14621bd59f896a59a8407e93e9eb044c21"} Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.148686 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" event={"ID":"57ad81e1-b9ef-405b-bb22-5d496f5d56c6","Type":"ContainerStarted","Data":"90a4684ca3ee0ed5f5d406ac2b6e0dc42a95002dd843bc4e536da9c9a5bca05f"} Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.148705 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.167679 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" podStartSLOduration=1.16765426 podStartE2EDuration="1.16765426s" podCreationTimestamp="2026-03-10 09:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:10:47.165858352 +0000 UTC m=+433.420756241" watchObservedRunningTime="2026-03-10 09:10:47.16765426 +0000 UTC m=+433.422552149" Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.448651 4883 patch_prober.go:28] interesting 
pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:10:47 crc kubenswrapper[4883]: I0310 09:10:47.448720 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:11:06 crc kubenswrapper[4883]: I0310 09:11:06.625625 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-x4jrh" Mar 10 09:11:06 crc kubenswrapper[4883]: I0310 09:11:06.665400 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.660581 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.663443 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ltgv7" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="registry-server" containerID="cri-o://91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777" gracePeriod=30 Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.667551 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tlhr4"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.667953 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tlhr4" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" 
containerName="registry-server" containerID="cri-o://f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce" gracePeriod=30 Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.675703 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.676045 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" containerID="cri-o://5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4" gracePeriod=30 Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.696171 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.696509 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2h5dv" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="registry-server" containerID="cri-o://88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125" gracePeriod=30 Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.699418 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.699703 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vhnvt" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="registry-server" containerID="cri-o://c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40" gracePeriod=30 Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.701922 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9d6jf"] Mar 10 09:11:15 crc 
kubenswrapper[4883]: I0310 09:11:15.702619 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.716637 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9d6jf"] Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.880794 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.880893 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.880928 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qlfj\" (UniqueName: \"kubernetes.io/projected/849aec1a-3ce6-4153-8e52-4bf0185e29e3-kube-api-access-6qlfj\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.981446 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.981544 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.981581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlfj\" (UniqueName: \"kubernetes.io/projected/849aec1a-3ce6-4153-8e52-4bf0185e29e3-kube-api-access-6qlfj\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.982872 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc kubenswrapper[4883]: I0310 09:11:15.986701 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/849aec1a-3ce6-4153-8e52-4bf0185e29e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:15 crc 
kubenswrapper[4883]: I0310 09:11:15.995631 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlfj\" (UniqueName: \"kubernetes.io/projected/849aec1a-3ce6-4153-8e52-4bf0185e29e3-kube-api-access-6qlfj\") pod \"marketplace-operator-79b997595-9d6jf\" (UID: \"849aec1a-3ce6-4153-8e52-4bf0185e29e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.017214 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.131757 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.138688 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.144611 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.162254 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.168037 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287182 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djb46\" (UniqueName: \"kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46\") pod \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287256 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvxh5\" (UniqueName: \"kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5\") pod \"740631be-94cf-4c75-a5a3-0dbd57e2e510\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287285 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content\") pod \"740631be-94cf-4c75-a5a3-0dbd57e2e510\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287323 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5m4n\" (UniqueName: \"kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n\") pod \"816c3b00-c481-4c08-9691-0244d3c044e3\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287353 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities\") pod \"7746695d-3e1f-455d-9acc-dffdba42c0d5\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287398 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-85hl4\" (UniqueName: \"kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4\") pod \"7746695d-3e1f-455d-9acc-dffdba42c0d5\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287423 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities\") pod \"816c3b00-c481-4c08-9691-0244d3c044e3\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287446 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content\") pod \"7746695d-3e1f-455d-9acc-dffdba42c0d5\" (UID: \"7746695d-3e1f-455d-9acc-dffdba42c0d5\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287495 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics\") pod \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287514 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content\") pod \"816c3b00-c481-4c08-9691-0244d3c044e3\" (UID: \"816c3b00-c481-4c08-9691-0244d3c044e3\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287534 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lldvp\" (UniqueName: \"kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp\") pod 
\"74014fb3-ee38-481a-a27f-f12ff7f2c29a\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287564 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities\") pod \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287594 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca\") pod \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\" (UID: \"74014fb3-ee38-481a-a27f-f12ff7f2c29a\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287611 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content\") pod \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\" (UID: \"fa724d40-49c8-4d1d-a7e9-5af8f0603e19\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.287630 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities\") pod \"740631be-94cf-4c75-a5a3-0dbd57e2e510\" (UID: \"740631be-94cf-4c75-a5a3-0dbd57e2e510\") " Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.288110 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities" (OuterVolumeSpecName: "utilities") pod "7746695d-3e1f-455d-9acc-dffdba42c0d5" (UID: "7746695d-3e1f-455d-9acc-dffdba42c0d5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.288562 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities" (OuterVolumeSpecName: "utilities") pod "740631be-94cf-4c75-a5a3-0dbd57e2e510" (UID: "740631be-94cf-4c75-a5a3-0dbd57e2e510"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.288903 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "74014fb3-ee38-481a-a27f-f12ff7f2c29a" (UID: "74014fb3-ee38-481a-a27f-f12ff7f2c29a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.289589 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities" (OuterVolumeSpecName: "utilities") pod "fa724d40-49c8-4d1d-a7e9-5af8f0603e19" (UID: "fa724d40-49c8-4d1d-a7e9-5af8f0603e19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.292879 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities" (OuterVolumeSpecName: "utilities") pod "816c3b00-c481-4c08-9691-0244d3c044e3" (UID: "816c3b00-c481-4c08-9691-0244d3c044e3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.294645 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4" (OuterVolumeSpecName: "kube-api-access-85hl4") pod "7746695d-3e1f-455d-9acc-dffdba42c0d5" (UID: "7746695d-3e1f-455d-9acc-dffdba42c0d5"). InnerVolumeSpecName "kube-api-access-85hl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.294921 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "74014fb3-ee38-481a-a27f-f12ff7f2c29a" (UID: "74014fb3-ee38-481a-a27f-f12ff7f2c29a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.295137 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n" (OuterVolumeSpecName: "kube-api-access-p5m4n") pod "816c3b00-c481-4c08-9691-0244d3c044e3" (UID: "816c3b00-c481-4c08-9691-0244d3c044e3"). InnerVolumeSpecName "kube-api-access-p5m4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.295871 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46" (OuterVolumeSpecName: "kube-api-access-djb46") pod "fa724d40-49c8-4d1d-a7e9-5af8f0603e19" (UID: "fa724d40-49c8-4d1d-a7e9-5af8f0603e19"). InnerVolumeSpecName "kube-api-access-djb46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.299692 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5" (OuterVolumeSpecName: "kube-api-access-xvxh5") pod "740631be-94cf-4c75-a5a3-0dbd57e2e510" (UID: "740631be-94cf-4c75-a5a3-0dbd57e2e510"). InnerVolumeSpecName "kube-api-access-xvxh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.300198 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp" (OuterVolumeSpecName: "kube-api-access-lldvp") pod "74014fb3-ee38-481a-a27f-f12ff7f2c29a" (UID: "74014fb3-ee38-481a-a27f-f12ff7f2c29a"). InnerVolumeSpecName "kube-api-access-lldvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.312466 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7746695d-3e1f-455d-9acc-dffdba42c0d5" (UID: "7746695d-3e1f-455d-9acc-dffdba42c0d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.342995 4883 generic.go:334] "Generic (PLEG): container finished" podID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerID="5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4" exitCode=0 Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.343057 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerDied","Data":"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.343431 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" event={"ID":"74014fb3-ee38-481a-a27f-f12ff7f2c29a","Type":"ContainerDied","Data":"ee1e42ffe97556105d0510c897a1238a2dd105fd96a60722e66b11e2fc0634b8"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.343610 4883 scope.go:117] "RemoveContainer" containerID="5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.343074 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ndt59" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.346100 4883 generic.go:334] "Generic (PLEG): container finished" podID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerID="88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125" exitCode=0 Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.346146 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2h5dv" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.346186 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerDied","Data":"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.346225 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2h5dv" event={"ID":"7746695d-3e1f-455d-9acc-dffdba42c0d5","Type":"ContainerDied","Data":"7cd4d72ef0244e1c6f3955303b46c7d75041bd13eacfaf569a15ddb645d99b32"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.350067 4883 generic.go:334] "Generic (PLEG): container finished" podID="816c3b00-c481-4c08-9691-0244d3c044e3" containerID="91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777" exitCode=0 Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.350103 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerDied","Data":"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.350146 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ltgv7" event={"ID":"816c3b00-c481-4c08-9691-0244d3c044e3","Type":"ContainerDied","Data":"f5c87addac3f89a4858d25eb6fa3c57863872b10777952494e3f153096638f60"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.350174 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ltgv7" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.352176 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "816c3b00-c481-4c08-9691-0244d3c044e3" (UID: "816c3b00-c481-4c08-9691-0244d3c044e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.352897 4883 generic.go:334] "Generic (PLEG): container finished" podID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerID="f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce" exitCode=0 Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.352960 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerDied","Data":"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.352979 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tlhr4" event={"ID":"fa724d40-49c8-4d1d-a7e9-5af8f0603e19","Type":"ContainerDied","Data":"bb7cd6f6d9fdd71abf19936362d7bad0ceebe476125419e2660408cc85f60192"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.352977 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tlhr4" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.357666 4883 scope.go:117] "RemoveContainer" containerID="44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.358869 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa724d40-49c8-4d1d-a7e9-5af8f0603e19" (UID: "fa724d40-49c8-4d1d-a7e9-5af8f0603e19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.360024 4883 generic.go:334] "Generic (PLEG): container finished" podID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerID="c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40" exitCode=0 Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.360073 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerDied","Data":"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.360094 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vhnvt" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.360107 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhnvt" event={"ID":"740631be-94cf-4c75-a5a3-0dbd57e2e510","Type":"ContainerDied","Data":"7617eec4e807a31ae8dae401f57247ee0d7df593c7506b5c96f9dc3caf16e27a"} Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.374424 4883 scope.go:117] "RemoveContainer" containerID="5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.374964 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4\": container with ID starting with 5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4 not found: ID does not exist" containerID="5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.375009 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4"} err="failed to get container status \"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4\": rpc error: code = NotFound desc = could not find container \"5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4\": container with ID starting with 5d54daaa5062e0dec36cd7440fadd3884c0b3fd1a82619a0cedf9cd95d551aa4 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.375041 4883 scope.go:117] "RemoveContainer" containerID="44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.375389 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907\": container with ID starting with 44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907 not found: ID does not exist" containerID="44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.375431 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907"} err="failed to get container status \"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907\": rpc error: code = NotFound desc = could not find container \"44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907\": container with ID starting with 44e0081e7b03f1c2148d02e394f9444a7d35081493706ccfc844d5538fc94907 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.375459 4883 scope.go:117] "RemoveContainer" containerID="88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.383976 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389094 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2h5dv"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389513 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389545 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389559 4883 
reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389572 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/816c3b00-c481-4c08-9691-0244d3c044e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389584 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lldvp\" (UniqueName: \"kubernetes.io/projected/74014fb3-ee38-481a-a27f-f12ff7f2c29a-kube-api-access-lldvp\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389595 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389607 4883 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74014fb3-ee38-481a-a27f-f12ff7f2c29a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389618 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389628 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389639 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djb46\" 
(UniqueName: \"kubernetes.io/projected/fa724d40-49c8-4d1d-a7e9-5af8f0603e19-kube-api-access-djb46\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389649 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvxh5\" (UniqueName: \"kubernetes.io/projected/740631be-94cf-4c75-a5a3-0dbd57e2e510-kube-api-access-xvxh5\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389660 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5m4n\" (UniqueName: \"kubernetes.io/projected/816c3b00-c481-4c08-9691-0244d3c044e3-kube-api-access-p5m4n\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389671 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7746695d-3e1f-455d-9acc-dffdba42c0d5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.389682 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85hl4\" (UniqueName: \"kubernetes.io/projected/7746695d-3e1f-455d-9acc-dffdba42c0d5-kube-api-access-85hl4\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.390611 4883 scope.go:117] "RemoveContainer" containerID="055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.391690 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.394171 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ndt59"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.408773 4883 scope.go:117] "RemoveContainer" containerID="8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df" Mar 10 09:11:16 crc 
kubenswrapper[4883]: I0310 09:11:16.421982 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "740631be-94cf-4c75-a5a3-0dbd57e2e510" (UID: "740631be-94cf-4c75-a5a3-0dbd57e2e510"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.424981 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9d6jf"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.426402 4883 scope.go:117] "RemoveContainer" containerID="88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.427694 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125\": container with ID starting with 88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125 not found: ID does not exist" containerID="88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.427739 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125"} err="failed to get container status \"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125\": rpc error: code = NotFound desc = could not find container \"88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125\": container with ID starting with 88569bbcd03fda16b3b36b8b88fe5e83a1828510f2853c5e6b05197bece13125 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.427768 4883 scope.go:117] "RemoveContainer" 
containerID="055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.429053 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98\": container with ID starting with 055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98 not found: ID does not exist" containerID="055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.429389 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98"} err="failed to get container status \"055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98\": rpc error: code = NotFound desc = could not find container \"055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98\": container with ID starting with 055920580c677f44aa23e5d363a86261c0d6b831c7011c1e97b189c129052d98 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.429789 4883 scope.go:117] "RemoveContainer" containerID="8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.431259 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df\": container with ID starting with 8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df not found: ID does not exist" containerID="8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.431300 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df"} err="failed to get container status \"8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df\": rpc error: code = NotFound desc = could not find container \"8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df\": container with ID starting with 8263311faaef3bdc9cbc5ea07f28452be0f865eb49cb97ad3a140a15cb7538df not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.431339 4883 scope.go:117] "RemoveContainer" containerID="91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.466918 4883 scope.go:117] "RemoveContainer" containerID="a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.489443 4883 scope.go:117] "RemoveContainer" containerID="a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.490332 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/740631be-94cf-4c75-a5a3-0dbd57e2e510-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.509445 4883 scope.go:117] "RemoveContainer" containerID="91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.509910 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777\": container with ID starting with 91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777 not found: ID does not exist" containerID="91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.509942 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777"} err="failed to get container status \"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777\": rpc error: code = NotFound desc = could not find container \"91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777\": container with ID starting with 91801b127c1f16b2d1eba5e74a1b1bf4b0449efa39652cf558f141e94a8fc777 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.509974 4883 scope.go:117] "RemoveContainer" containerID="a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.510253 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f\": container with ID starting with a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f not found: ID does not exist" containerID="a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.510290 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f"} err="failed to get container status \"a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f\": rpc error: code = NotFound desc = could not find container \"a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f\": container with ID starting with a6ff82bc674fffd2299b24fadfa53521fadf94953271379ec9c614b2eef3bb4f not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.510315 4883 scope.go:117] "RemoveContainer" containerID="a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 
09:11:16.510994 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656\": container with ID starting with a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656 not found: ID does not exist" containerID="a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.511021 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656"} err="failed to get container status \"a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656\": rpc error: code = NotFound desc = could not find container \"a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656\": container with ID starting with a1fb865ae9bafc44a5d910bc59a296aa9d6edb4cb43b9e57d08352998bec2656 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.511036 4883 scope.go:117] "RemoveContainer" containerID="f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.523526 4883 scope.go:117] "RemoveContainer" containerID="4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.538106 4883 scope.go:117] "RemoveContainer" containerID="0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.551016 4883 scope.go:117] "RemoveContainer" containerID="f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.551898 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce\": container 
with ID starting with f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce not found: ID does not exist" containerID="f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.552402 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce"} err="failed to get container status \"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce\": rpc error: code = NotFound desc = could not find container \"f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce\": container with ID starting with f2b4472994708c4dd9bab560cdbdd60841027023fff08227146f3ba7dc4fe0ce not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.552434 4883 scope.go:117] "RemoveContainer" containerID="4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.553297 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6\": container with ID starting with 4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6 not found: ID does not exist" containerID="4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.553342 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6"} err="failed to get container status \"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6\": rpc error: code = NotFound desc = could not find container \"4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6\": container with ID starting with 4852fa6417295690eda68f4c82391834294fd5ce8501889a1ad79b11699ef7c6 not 
found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.553371 4883 scope.go:117] "RemoveContainer" containerID="0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.553721 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722\": container with ID starting with 0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722 not found: ID does not exist" containerID="0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.553747 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722"} err="failed to get container status \"0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722\": rpc error: code = NotFound desc = could not find container \"0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722\": container with ID starting with 0326da349c23f2cd65c001fcb9a6bd20e6c84c30c3a7c8b2c8fb9a5420eef722 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.553763 4883 scope.go:117] "RemoveContainer" containerID="c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.565555 4883 scope.go:117] "RemoveContainer" containerID="db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.580633 4883 scope.go:117] "RemoveContainer" containerID="50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.593646 4883 scope.go:117] "RemoveContainer" containerID="c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40" Mar 10 09:11:16 crc 
kubenswrapper[4883]: E0310 09:11:16.594178 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40\": container with ID starting with c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40 not found: ID does not exist" containerID="c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.594216 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40"} err="failed to get container status \"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40\": rpc error: code = NotFound desc = could not find container \"c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40\": container with ID starting with c29ebab9a6c382edbfa55b0733760c488aaa0de5e64e0a509f8280e5d2e79d40 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.594240 4883 scope.go:117] "RemoveContainer" containerID="db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.594703 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461\": container with ID starting with db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461 not found: ID does not exist" containerID="db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.594730 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461"} err="failed to get container status 
\"db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461\": rpc error: code = NotFound desc = could not find container \"db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461\": container with ID starting with db27eb7a9e29327e2f107eefbb71e1720172b07c0d9aca0c389d99a288043461 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.594746 4883 scope.go:117] "RemoveContainer" containerID="50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7" Mar 10 09:11:16 crc kubenswrapper[4883]: E0310 09:11:16.595045 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7\": container with ID starting with 50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7 not found: ID does not exist" containerID="50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.595071 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7"} err="failed to get container status \"50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7\": rpc error: code = NotFound desc = could not find container \"50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7\": container with ID starting with 50d34c6ad99e686a1451361b7163762c1098cda139257f6d2e615df2c22212c7 not found: ID does not exist" Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.682094 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.689991 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ltgv7"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.697770 4883 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.702883 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vhnvt"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.709373 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tlhr4"] Mar 10 09:11:16 crc kubenswrapper[4883]: I0310 09:11:16.712761 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tlhr4"] Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.369129 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" event={"ID":"849aec1a-3ce6-4153-8e52-4bf0185e29e3","Type":"ContainerStarted","Data":"65cb93f8e74db626d2881bca925a73ba4c6522ff6ae37af7a0978435830c4335"} Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.369195 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" event={"ID":"849aec1a-3ce6-4153-8e52-4bf0185e29e3","Type":"ContainerStarted","Data":"0d2c937827935f30f304975b437d057b684610d1efd3234a72edd6fe96d7dbcc"} Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.369662 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.375462 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.387183 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9d6jf" podStartSLOduration=2.387159968 podStartE2EDuration="2.387159968s" podCreationTimestamp="2026-03-10 09:11:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:11:17.383108318 +0000 UTC m=+463.638006206" watchObservedRunningTime="2026-03-10 09:11:17.387159968 +0000 UTC m=+463.642057857" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.449155 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.449407 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.449518 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.450198 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.450270 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" 
containerID="cri-o://dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a" gracePeriod=600 Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.534578 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99873383_15b6_42ee_a65f_7917294d2e02.slice/crio-dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99873383_15b6_42ee_a65f_7917294d2e02.slice/crio-conmon-dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874268 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v6jt6"] Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874687 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874702 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874710 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874716 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874727 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874735 4883 
state_mem.go:107] "Deleted CPUSet assignment" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874743 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874749 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874759 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874765 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874773 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874779 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874787 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874792 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874801 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874807 4883 
state_mem.go:107] "Deleted CPUSet assignment" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874813 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874819 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874827 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874832 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874838 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874844 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874851 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874858 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874877 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 
09:11:17.874883 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="extract-utilities" Mar 10 09:11:17 crc kubenswrapper[4883]: E0310 09:11:17.874891 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874897 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="extract-content" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.874993 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875004 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875016 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" containerName="marketplace-operator" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875022 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875030 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875038 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" containerName="registry-server" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.875788 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.877319 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 09:11:17 crc kubenswrapper[4883]: I0310 09:11:17.884795 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6jt6"] Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.014311 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-utilities\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.014373 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bjdw\" (UniqueName: \"kubernetes.io/projected/790ba2f9-1214-4040-a140-0663e2b869b1-kube-api-access-6bjdw\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.014465 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-catalog-content\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.074507 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g87df"] Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.076143 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.081811 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.089666 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74014fb3-ee38-481a-a27f-f12ff7f2c29a" path="/var/lib/kubelet/pods/74014fb3-ee38-481a-a27f-f12ff7f2c29a/volumes" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.090532 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740631be-94cf-4c75-a5a3-0dbd57e2e510" path="/var/lib/kubelet/pods/740631be-94cf-4c75-a5a3-0dbd57e2e510/volumes" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.091142 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7746695d-3e1f-455d-9acc-dffdba42c0d5" path="/var/lib/kubelet/pods/7746695d-3e1f-455d-9acc-dffdba42c0d5/volumes" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.091727 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="816c3b00-c481-4c08-9691-0244d3c044e3" path="/var/lib/kubelet/pods/816c3b00-c481-4c08-9691-0244d3c044e3/volumes" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.092293 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa724d40-49c8-4d1d-a7e9-5af8f0603e19" path="/var/lib/kubelet/pods/fa724d40-49c8-4d1d-a7e9-5af8f0603e19/volumes" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.097649 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g87df"] Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.115961 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-utilities\") pod \"redhat-marketplace-v6jt6\" (UID: 
\"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.116025 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bjdw\" (UniqueName: \"kubernetes.io/projected/790ba2f9-1214-4040-a140-0663e2b869b1-kube-api-access-6bjdw\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.116084 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-catalog-content\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.116467 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-utilities\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.116578 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/790ba2f9-1214-4040-a140-0663e2b869b1-catalog-content\") pod \"redhat-marketplace-v6jt6\" (UID: \"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.134180 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bjdw\" (UniqueName: \"kubernetes.io/projected/790ba2f9-1214-4040-a140-0663e2b869b1-kube-api-access-6bjdw\") pod \"redhat-marketplace-v6jt6\" (UID: 
\"790ba2f9-1214-4040-a140-0663e2b869b1\") " pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.189041 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.217027 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s42bf\" (UniqueName: \"kubernetes.io/projected/06556553-1ab9-4217-ad98-679ff31feaf9-kube-api-access-s42bf\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.217078 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-catalog-content\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.217128 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-utilities\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.318805 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s42bf\" (UniqueName: \"kubernetes.io/projected/06556553-1ab9-4217-ad98-679ff31feaf9-kube-api-access-s42bf\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.318852 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-catalog-content\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.318906 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-utilities\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.319922 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-catalog-content\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.319938 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06556553-1ab9-4217-ad98-679ff31feaf9-utilities\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.333820 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s42bf\" (UniqueName: \"kubernetes.io/projected/06556553-1ab9-4217-ad98-679ff31feaf9-kube-api-access-s42bf\") pod \"redhat-operators-g87df\" (UID: \"06556553-1ab9-4217-ad98-679ff31feaf9\") " pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.383813 4883 generic.go:334] "Generic (PLEG): container finished" 
podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a" exitCode=0 Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.383918 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a"} Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.383995 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13"} Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.384022 4883 scope.go:117] "RemoveContainer" containerID="3f848cd7e6ff19b9680e81ac37d539657eb0a1e640e277a0e7dc6cd1a2c443d8" Mar 10 09:11:18 crc kubenswrapper[4883]: I0310 09:11:18.393565 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:18.543253 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6jt6"] Mar 10 09:11:19 crc kubenswrapper[4883]: W0310 09:11:18.555455 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790ba2f9_1214_4040_a140_0663e2b869b1.slice/crio-00155b5c512dae8d1115e236d8a73c737796c6264f06ea4714abd0a014db45bf WatchSource:0}: Error finding container 00155b5c512dae8d1115e236d8a73c737796c6264f06ea4714abd0a014db45bf: Status 404 returned error can't find the container with id 00155b5c512dae8d1115e236d8a73c737796c6264f06ea4714abd0a014db45bf Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:18.782374 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g87df"] Mar 10 09:11:19 crc kubenswrapper[4883]: W0310 09:11:18.788084 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06556553_1ab9_4217_ad98_679ff31feaf9.slice/crio-fdd3d35ca65e39ed1b3a4bdf4797e16f0e4bbfc4522d57e076f72cff3456ee94 WatchSource:0}: Error finding container fdd3d35ca65e39ed1b3a4bdf4797e16f0e4bbfc4522d57e076f72cff3456ee94: Status 404 returned error can't find the container with id fdd3d35ca65e39ed1b3a4bdf4797e16f0e4bbfc4522d57e076f72cff3456ee94 Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.390423 4883 generic.go:334] "Generic (PLEG): container finished" podID="790ba2f9-1214-4040-a140-0663e2b869b1" containerID="d1c1f129c5a12787ef11b631be04a777292a30f6fd8961f2da928f0c69a8fb18" exitCode=0 Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.390538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6jt6" 
event={"ID":"790ba2f9-1214-4040-a140-0663e2b869b1","Type":"ContainerDied","Data":"d1c1f129c5a12787ef11b631be04a777292a30f6fd8961f2da928f0c69a8fb18"} Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.390799 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6jt6" event={"ID":"790ba2f9-1214-4040-a140-0663e2b869b1","Type":"ContainerStarted","Data":"00155b5c512dae8d1115e236d8a73c737796c6264f06ea4714abd0a014db45bf"} Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.392180 4883 generic.go:334] "Generic (PLEG): container finished" podID="06556553-1ab9-4217-ad98-679ff31feaf9" containerID="d7d62049e5d70020d0e4432a921c480056f1ddc660f983393b3160d09b750bfc" exitCode=0 Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.392242 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g87df" event={"ID":"06556553-1ab9-4217-ad98-679ff31feaf9","Type":"ContainerDied","Data":"d7d62049e5d70020d0e4432a921c480056f1ddc660f983393b3160d09b750bfc"} Mar 10 09:11:19 crc kubenswrapper[4883]: I0310 09:11:19.392273 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g87df" event={"ID":"06556553-1ab9-4217-ad98-679ff31feaf9","Type":"ContainerStarted","Data":"fdd3d35ca65e39ed1b3a4bdf4797e16f0e4bbfc4522d57e076f72cff3456ee94"} Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.281165 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p7kbr"] Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.282452 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.283678 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7kbr"] Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.284117 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.403739 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6jt6" event={"ID":"790ba2f9-1214-4040-a140-0663e2b869b1","Type":"ContainerStarted","Data":"b57e3b9dc79342cf020637a710b7a7110bc519e6f310e4e36097cf3a2ad58157"} Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.407144 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g87df" event={"ID":"06556553-1ab9-4217-ad98-679ff31feaf9","Type":"ContainerStarted","Data":"063421c6ebb33f3707f60107cb478af0622a4ad32041380fff67ec33cab7b5fc"} Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.449748 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-catalog-content\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.449831 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-utilities\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.449916 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbw6\" (UniqueName: \"kubernetes.io/projected/f43173ae-a262-4efa-8141-419be6d01b7d-kube-api-access-ffbw6\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.469790 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gqg54"] Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.470769 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.474831 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.479720 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqg54"] Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.551500 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-catalog-content\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.551556 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-utilities\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.552029 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-catalog-content\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.552066 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f43173ae-a262-4efa-8141-419be6d01b7d-utilities\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.552104 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbw6\" (UniqueName: \"kubernetes.io/projected/f43173ae-a262-4efa-8141-419be6d01b7d-kube-api-access-ffbw6\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.568738 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbw6\" (UniqueName: \"kubernetes.io/projected/f43173ae-a262-4efa-8141-419be6d01b7d-kube-api-access-ffbw6\") pod \"certified-operators-p7kbr\" (UID: \"f43173ae-a262-4efa-8141-419be6d01b7d\") " pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.604529 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.652822 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-catalog-content\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.652890 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-utilities\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.652975 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffkw8\" (UniqueName: \"kubernetes.io/projected/8e7df241-6476-44a7-a800-921897b7e381-kube-api-access-ffkw8\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.754592 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-catalog-content\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.755127 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-catalog-content\") pod 
\"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.755585 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-utilities\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.755872 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7df241-6476-44a7-a800-921897b7e381-utilities\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.755977 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffkw8\" (UniqueName: \"kubernetes.io/projected/8e7df241-6476-44a7-a800-921897b7e381-kube-api-access-ffkw8\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.772425 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffkw8\" (UniqueName: \"kubernetes.io/projected/8e7df241-6476-44a7-a800-921897b7e381-kube-api-access-ffkw8\") pod \"community-operators-gqg54\" (UID: \"8e7df241-6476-44a7-a800-921897b7e381\") " pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.835710 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:20 crc kubenswrapper[4883]: I0310 09:11:20.954260 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7kbr"] Mar 10 09:11:20 crc kubenswrapper[4883]: W0310 09:11:20.960048 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf43173ae_a262_4efa_8141_419be6d01b7d.slice/crio-abecc8eee2fc4b23ee465ef889874fbc8a7ea8ef0ec908e1057df4645f2ebca9 WatchSource:0}: Error finding container abecc8eee2fc4b23ee465ef889874fbc8a7ea8ef0ec908e1057df4645f2ebca9: Status 404 returned error can't find the container with id abecc8eee2fc4b23ee465ef889874fbc8a7ea8ef0ec908e1057df4645f2ebca9 Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.207775 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqg54"] Mar 10 09:11:21 crc kubenswrapper[4883]: W0310 09:11:21.212954 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7df241_6476_44a7_a800_921897b7e381.slice/crio-80ebfddbe5d4c043f26dda09827ea3b9f4b82570810beff570f305d8620df8da WatchSource:0}: Error finding container 80ebfddbe5d4c043f26dda09827ea3b9f4b82570810beff570f305d8620df8da: Status 404 returned error can't find the container with id 80ebfddbe5d4c043f26dda09827ea3b9f4b82570810beff570f305d8620df8da Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.415541 4883 generic.go:334] "Generic (PLEG): container finished" podID="790ba2f9-1214-4040-a140-0663e2b869b1" containerID="b57e3b9dc79342cf020637a710b7a7110bc519e6f310e4e36097cf3a2ad58157" exitCode=0 Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.415652 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6jt6" 
event={"ID":"790ba2f9-1214-4040-a140-0663e2b869b1","Type":"ContainerDied","Data":"b57e3b9dc79342cf020637a710b7a7110bc519e6f310e4e36097cf3a2ad58157"} Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.417506 4883 generic.go:334] "Generic (PLEG): container finished" podID="06556553-1ab9-4217-ad98-679ff31feaf9" containerID="063421c6ebb33f3707f60107cb478af0622a4ad32041380fff67ec33cab7b5fc" exitCode=0 Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.417575 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g87df" event={"ID":"06556553-1ab9-4217-ad98-679ff31feaf9","Type":"ContainerDied","Data":"063421c6ebb33f3707f60107cb478af0622a4ad32041380fff67ec33cab7b5fc"} Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.420032 4883 generic.go:334] "Generic (PLEG): container finished" podID="8e7df241-6476-44a7-a800-921897b7e381" containerID="8843f3fb95d798ba8a042916ea39f2877a0b087180e5dd8510d27c412884048c" exitCode=0 Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.420130 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqg54" event={"ID":"8e7df241-6476-44a7-a800-921897b7e381","Type":"ContainerDied","Data":"8843f3fb95d798ba8a042916ea39f2877a0b087180e5dd8510d27c412884048c"} Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.420168 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqg54" event={"ID":"8e7df241-6476-44a7-a800-921897b7e381","Type":"ContainerStarted","Data":"80ebfddbe5d4c043f26dda09827ea3b9f4b82570810beff570f305d8620df8da"} Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.421549 4883 generic.go:334] "Generic (PLEG): container finished" podID="f43173ae-a262-4efa-8141-419be6d01b7d" containerID="11328ab9bfc39030bd2e5f157c6fd6fb571958651803563ee07642b9b4f4289d" exitCode=0 Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.421583 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-p7kbr" event={"ID":"f43173ae-a262-4efa-8141-419be6d01b7d","Type":"ContainerDied","Data":"11328ab9bfc39030bd2e5f157c6fd6fb571958651803563ee07642b9b4f4289d"} Mar 10 09:11:21 crc kubenswrapper[4883]: I0310 09:11:21.421615 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7kbr" event={"ID":"f43173ae-a262-4efa-8141-419be6d01b7d","Type":"ContainerStarted","Data":"abecc8eee2fc4b23ee465ef889874fbc8a7ea8ef0ec908e1057df4645f2ebca9"} Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.432999 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqg54" event={"ID":"8e7df241-6476-44a7-a800-921897b7e381","Type":"ContainerStarted","Data":"ba8fc165cfc7a7c565522559d94192a1515f7552c9f0775863a01e383f590805"} Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.435572 4883 generic.go:334] "Generic (PLEG): container finished" podID="f43173ae-a262-4efa-8141-419be6d01b7d" containerID="8c9fa15df75782b50140862b92a1a092d40d1ff07e077a68309ff912f8925ab0" exitCode=0 Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.435628 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7kbr" event={"ID":"f43173ae-a262-4efa-8141-419be6d01b7d","Type":"ContainerDied","Data":"8c9fa15df75782b50140862b92a1a092d40d1ff07e077a68309ff912f8925ab0"} Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.439378 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6jt6" event={"ID":"790ba2f9-1214-4040-a140-0663e2b869b1","Type":"ContainerStarted","Data":"55de9c65aab5bd1c255a807d31442d5ee4e3aa60b789dce327cd95f8385bc6bc"} Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.442006 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g87df" 
event={"ID":"06556553-1ab9-4217-ad98-679ff31feaf9","Type":"ContainerStarted","Data":"292eae0399f0d439e5a73bfdd16a81dd3749c12520ff10d70fcc84b78ed738df"} Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.468552 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g87df" podStartSLOduration=1.748040482 podStartE2EDuration="4.46853128s" podCreationTimestamp="2026-03-10 09:11:18 +0000 UTC" firstStartedPulling="2026-03-10 09:11:19.393908612 +0000 UTC m=+465.648806501" lastFinishedPulling="2026-03-10 09:11:22.114399409 +0000 UTC m=+468.369297299" observedRunningTime="2026-03-10 09:11:22.464664128 +0000 UTC m=+468.719562017" watchObservedRunningTime="2026-03-10 09:11:22.46853128 +0000 UTC m=+468.723429169" Mar 10 09:11:22 crc kubenswrapper[4883]: I0310 09:11:22.499550 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v6jt6" podStartSLOduration=2.906578505 podStartE2EDuration="5.499533035s" podCreationTimestamp="2026-03-10 09:11:17 +0000 UTC" firstStartedPulling="2026-03-10 09:11:19.392748715 +0000 UTC m=+465.647646603" lastFinishedPulling="2026-03-10 09:11:21.985703243 +0000 UTC m=+468.240601133" observedRunningTime="2026-03-10 09:11:22.495392307 +0000 UTC m=+468.750290207" watchObservedRunningTime="2026-03-10 09:11:22.499533035 +0000 UTC m=+468.754430925" Mar 10 09:11:23 crc kubenswrapper[4883]: I0310 09:11:23.452380 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7kbr" event={"ID":"f43173ae-a262-4efa-8141-419be6d01b7d","Type":"ContainerStarted","Data":"fd5bfa6abee8295036504b162c76c120d8beadf3fd59e0557255941439737175"} Mar 10 09:11:23 crc kubenswrapper[4883]: I0310 09:11:23.456207 4883 generic.go:334] "Generic (PLEG): container finished" podID="8e7df241-6476-44a7-a800-921897b7e381" containerID="ba8fc165cfc7a7c565522559d94192a1515f7552c9f0775863a01e383f590805" exitCode=0 Mar 10 
09:11:23 crc kubenswrapper[4883]: I0310 09:11:23.456257 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqg54" event={"ID":"8e7df241-6476-44a7-a800-921897b7e381","Type":"ContainerDied","Data":"ba8fc165cfc7a7c565522559d94192a1515f7552c9f0775863a01e383f590805"} Mar 10 09:11:23 crc kubenswrapper[4883]: I0310 09:11:23.473273 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p7kbr" podStartSLOduration=1.945969673 podStartE2EDuration="3.473252081s" podCreationTimestamp="2026-03-10 09:11:20 +0000 UTC" firstStartedPulling="2026-03-10 09:11:21.423098062 +0000 UTC m=+467.677995951" lastFinishedPulling="2026-03-10 09:11:22.95038047 +0000 UTC m=+469.205278359" observedRunningTime="2026-03-10 09:11:23.472379516 +0000 UTC m=+469.727277405" watchObservedRunningTime="2026-03-10 09:11:23.473252081 +0000 UTC m=+469.728149970" Mar 10 09:11:24 crc kubenswrapper[4883]: I0310 09:11:24.463708 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqg54" event={"ID":"8e7df241-6476-44a7-a800-921897b7e381","Type":"ContainerStarted","Data":"5aa6d0e2366b6a6450d2e13b1ecaf223b1e41665d4ed3461495ce272366c692c"} Mar 10 09:11:24 crc kubenswrapper[4883]: I0310 09:11:24.485629 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gqg54" podStartSLOduration=1.93511128 podStartE2EDuration="4.48560823s" podCreationTimestamp="2026-03-10 09:11:20 +0000 UTC" firstStartedPulling="2026-03-10 09:11:21.421167382 +0000 UTC m=+467.676065271" lastFinishedPulling="2026-03-10 09:11:23.971664332 +0000 UTC m=+470.226562221" observedRunningTime="2026-03-10 09:11:24.481649415 +0000 UTC m=+470.736547305" watchObservedRunningTime="2026-03-10 09:11:24.48560823 +0000 UTC m=+470.740506119" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.189821 4883 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.190587 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.224776 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.394132 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.394209 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.427710 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.516169 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g87df" Mar 10 09:11:28 crc kubenswrapper[4883]: I0310 09:11:28.516627 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v6jt6" Mar 10 09:11:30 crc kubenswrapper[4883]: I0310 09:11:30.605059 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:30 crc kubenswrapper[4883]: I0310 09:11:30.605205 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:30 crc kubenswrapper[4883]: I0310 09:11:30.637569 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:30 crc 
kubenswrapper[4883]: I0310 09:11:30.835983 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:30 crc kubenswrapper[4883]: I0310 09:11:30.836054 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:30 crc kubenswrapper[4883]: I0310 09:11:30.869055 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:31 crc kubenswrapper[4883]: I0310 09:11:31.536853 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p7kbr" Mar 10 09:11:31 crc kubenswrapper[4883]: I0310 09:11:31.540085 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gqg54" Mar 10 09:11:31 crc kubenswrapper[4883]: I0310 09:11:31.698270 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" podUID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" containerName="registry" containerID="cri-o://7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431" gracePeriod=30 Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.044314 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208009 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208092 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9wjh\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208242 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208280 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208308 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208333 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208390 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.208420 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets\") pod \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\" (UID: \"7bfdcb1d-e416-438c-9916-5c42cf35f2eb\") " Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.210014 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.210108 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.217365 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh" (OuterVolumeSpecName: "kube-api-access-q9wjh") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "kube-api-access-q9wjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.217466 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.217805 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.217991 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.223870 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.231610 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7bfdcb1d-e416-438c-9916-5c42cf35f2eb" (UID: "7bfdcb1d-e416-438c-9916-5c42cf35f2eb"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311061 4883 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311110 4883 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311128 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9wjh\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-kube-api-access-q9wjh\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311140 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311151 4883 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311160 4883 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.311169 4883 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7bfdcb1d-e416-438c-9916-5c42cf35f2eb-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.512456 4883 generic.go:334] "Generic (PLEG): container finished" podID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" containerID="7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431" exitCode=0 Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.512609 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.512609 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" event={"ID":"7bfdcb1d-e416-438c-9916-5c42cf35f2eb","Type":"ContainerDied","Data":"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431"} Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.513108 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jv69n" event={"ID":"7bfdcb1d-e416-438c-9916-5c42cf35f2eb","Type":"ContainerDied","Data":"ba9d597cdd4e690659606d934bb4d1fb3e310147327af93f1ac8149f438281d6"} Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.513142 4883 scope.go:117] "RemoveContainer" containerID="7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.536564 4883 scope.go:117] "RemoveContainer" containerID="7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431" Mar 10 09:11:32 crc kubenswrapper[4883]: E0310 09:11:32.537186 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431\": container with ID starting with 7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431 not found: ID does not exist" containerID="7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.537223 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431"} err="failed to get container status \"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431\": rpc error: code = NotFound desc = could not find container 
\"7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431\": container with ID starting with 7cbe641f7ce474ffff76c26085e2c5587219e01326162c10dfc88c1169075431 not found: ID does not exist" Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.558549 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"] Mar 10 09:11:32 crc kubenswrapper[4883]: I0310 09:11:32.561534 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jv69n"] Mar 10 09:11:34 crc kubenswrapper[4883]: I0310 09:11:34.087615 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" path="/var/lib/kubelet/pods/7bfdcb1d-e416-438c-9916-5c42cf35f2eb/volumes" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.133750 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552232-d429x"] Mar 10 09:12:00 crc kubenswrapper[4883]: E0310 09:12:00.134538 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" containerName="registry" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.134554 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" containerName="registry" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.134661 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bfdcb1d-e416-438c-9916-5c42cf35f2eb" containerName="registry" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.135084 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.137660 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.137799 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.137919 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.139773 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-d429x"] Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.226140 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vxd9\" (UniqueName: \"kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9\") pod \"auto-csr-approver-29552232-d429x\" (UID: \"bbf9a36e-c0e2-4943-a87c-9f6735b2714e\") " pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.327790 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vxd9\" (UniqueName: \"kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9\") pod \"auto-csr-approver-29552232-d429x\" (UID: \"bbf9a36e-c0e2-4943-a87c-9f6735b2714e\") " pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.346536 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vxd9\" (UniqueName: \"kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9\") pod \"auto-csr-approver-29552232-d429x\" (UID: \"bbf9a36e-c0e2-4943-a87c-9f6735b2714e\") " 
pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.447930 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:00 crc kubenswrapper[4883]: I0310 09:12:00.802741 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-d429x"] Mar 10 09:12:01 crc kubenswrapper[4883]: I0310 09:12:01.672747 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552232-d429x" event={"ID":"bbf9a36e-c0e2-4943-a87c-9f6735b2714e","Type":"ContainerStarted","Data":"38121c79b8ef164512014cedd9718eef5e4d30811a6036a632888659585fca79"} Mar 10 09:12:02 crc kubenswrapper[4883]: I0310 09:12:02.680618 4883 generic.go:334] "Generic (PLEG): container finished" podID="bbf9a36e-c0e2-4943-a87c-9f6735b2714e" containerID="49c1aa583870be3098bda47d15de71e40f64a8b97a906132b01f7c81a5eefc00" exitCode=0 Mar 10 09:12:02 crc kubenswrapper[4883]: I0310 09:12:02.680708 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552232-d429x" event={"ID":"bbf9a36e-c0e2-4943-a87c-9f6735b2714e","Type":"ContainerDied","Data":"49c1aa583870be3098bda47d15de71e40f64a8b97a906132b01f7c81a5eefc00"} Mar 10 09:12:03 crc kubenswrapper[4883]: I0310 09:12:03.866546 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:03 crc kubenswrapper[4883]: I0310 09:12:03.973101 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vxd9\" (UniqueName: \"kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9\") pod \"bbf9a36e-c0e2-4943-a87c-9f6735b2714e\" (UID: \"bbf9a36e-c0e2-4943-a87c-9f6735b2714e\") " Mar 10 09:12:03 crc kubenswrapper[4883]: I0310 09:12:03.978893 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9" (OuterVolumeSpecName: "kube-api-access-5vxd9") pod "bbf9a36e-c0e2-4943-a87c-9f6735b2714e" (UID: "bbf9a36e-c0e2-4943-a87c-9f6735b2714e"). InnerVolumeSpecName "kube-api-access-5vxd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.074682 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vxd9\" (UniqueName: \"kubernetes.io/projected/bbf9a36e-c0e2-4943-a87c-9f6735b2714e-kube-api-access-5vxd9\") on node \"crc\" DevicePath \"\"" Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.694856 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552232-d429x" event={"ID":"bbf9a36e-c0e2-4943-a87c-9f6735b2714e","Type":"ContainerDied","Data":"38121c79b8ef164512014cedd9718eef5e4d30811a6036a632888659585fca79"} Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.694932 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38121c79b8ef164512014cedd9718eef5e4d30811a6036a632888659585fca79" Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.694955 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-d429x" Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.914976 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-jp7d9"] Mar 10 09:12:04 crc kubenswrapper[4883]: I0310 09:12:04.919168 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-jp7d9"] Mar 10 09:12:06 crc kubenswrapper[4883]: I0310 09:12:06.085081 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="632d4971-be4e-4939-a46a-42604b182436" path="/var/lib/kubelet/pods/632d4971-be4e-4939-a46a-42604b182436/volumes" Mar 10 09:13:17 crc kubenswrapper[4883]: I0310 09:13:17.449468 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:13:17 crc kubenswrapper[4883]: I0310 09:13:17.449909 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:13:34 crc kubenswrapper[4883]: I0310 09:13:34.294052 4883 scope.go:117] "RemoveContainer" containerID="4f14915d4656a1c4b614a36f6c062aeb515069a6fa76ae902ddf80e353fe48d5" Mar 10 09:13:47 crc kubenswrapper[4883]: I0310 09:13:47.448963 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:13:47 crc kubenswrapper[4883]: 
I0310 09:13:47.449297 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.124178 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552234-ftnh5"] Mar 10 09:14:00 crc kubenswrapper[4883]: E0310 09:14:00.124783 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf9a36e-c0e2-4943-a87c-9f6735b2714e" containerName="oc" Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.124796 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf9a36e-c0e2-4943-a87c-9f6735b2714e" containerName="oc" Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.124923 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf9a36e-c0e2-4943-a87c-9f6735b2714e" containerName="oc" Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.125300 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.126873 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.126919 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.127001 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.129044 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-ftnh5"]
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.263462 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48gqm\" (UniqueName: \"kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm\") pod \"auto-csr-approver-29552234-ftnh5\" (UID: \"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9\") " pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.364587 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48gqm\" (UniqueName: \"kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm\") pod \"auto-csr-approver-29552234-ftnh5\" (UID: \"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9\") " pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.379295 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48gqm\" (UniqueName: \"kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm\") pod \"auto-csr-approver-29552234-ftnh5\" (UID: \"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9\") " pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.441402 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.771610 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-ftnh5"]
Mar 10 09:14:00 crc kubenswrapper[4883]: I0310 09:14:00.778115 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 09:14:01 crc kubenswrapper[4883]: I0310 09:14:01.277847 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552234-ftnh5" event={"ID":"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9","Type":"ContainerStarted","Data":"8528f2157f0acd32328646f1dce7be45a561faceb1ed76a0c35c69cc8d4bd560"}
Mar 10 09:14:02 crc kubenswrapper[4883]: I0310 09:14:02.288077 4883 generic.go:334] "Generic (PLEG): container finished" podID="80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" containerID="3e7da8f0c03e771b080917bc83392de1ddb5243f6ec147ddb91205eab0cfd88f" exitCode=0
Mar 10 09:14:02 crc kubenswrapper[4883]: I0310 09:14:02.288286 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552234-ftnh5" event={"ID":"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9","Type":"ContainerDied","Data":"3e7da8f0c03e771b080917bc83392de1ddb5243f6ec147ddb91205eab0cfd88f"}
Mar 10 09:14:03 crc kubenswrapper[4883]: I0310 09:14:03.444807 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:03 crc kubenswrapper[4883]: I0310 09:14:03.598718 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48gqm\" (UniqueName: \"kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm\") pod \"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9\" (UID: \"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9\") "
Mar 10 09:14:03 crc kubenswrapper[4883]: I0310 09:14:03.603722 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm" (OuterVolumeSpecName: "kube-api-access-48gqm") pod "80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" (UID: "80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9"). InnerVolumeSpecName "kube-api-access-48gqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:14:03 crc kubenswrapper[4883]: I0310 09:14:03.699940 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48gqm\" (UniqueName: \"kubernetes.io/projected/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9-kube-api-access-48gqm\") on node \"crc\" DevicePath \"\""
Mar 10 09:14:04 crc kubenswrapper[4883]: I0310 09:14:04.298651 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552234-ftnh5" event={"ID":"80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9","Type":"ContainerDied","Data":"8528f2157f0acd32328646f1dce7be45a561faceb1ed76a0c35c69cc8d4bd560"}
Mar 10 09:14:04 crc kubenswrapper[4883]: I0310 09:14:04.298693 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8528f2157f0acd32328646f1dce7be45a561faceb1ed76a0c35c69cc8d4bd560"
Mar 10 09:14:04 crc kubenswrapper[4883]: I0310 09:14:04.298752 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-ftnh5"
Mar 10 09:14:04 crc kubenswrapper[4883]: I0310 09:14:04.485365 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-kn7mm"]
Mar 10 09:14:04 crc kubenswrapper[4883]: I0310 09:14:04.490800 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-kn7mm"]
Mar 10 09:14:06 crc kubenswrapper[4883]: I0310 09:14:06.085272 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c07643a3-c0a9-4770-a08e-ab4fb32dfe8e" path="/var/lib/kubelet/pods/c07643a3-c0a9-4770-a08e-ab4fb32dfe8e/volumes"
Mar 10 09:14:17 crc kubenswrapper[4883]: I0310 09:14:17.449522 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:14:17 crc kubenswrapper[4883]: I0310 09:14:17.449729 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:14:17 crc kubenswrapper[4883]: I0310 09:14:17.449765 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8"
Mar 10 09:14:17 crc kubenswrapper[4883]: I0310 09:14:17.450156 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 09:14:17 crc kubenswrapper[4883]: I0310 09:14:17.450213 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13" gracePeriod=600
Mar 10 09:14:18 crc kubenswrapper[4883]: I0310 09:14:18.365270 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13" exitCode=0
Mar 10 09:14:18 crc kubenswrapper[4883]: I0310 09:14:18.365335 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13"}
Mar 10 09:14:18 crc kubenswrapper[4883]: I0310 09:14:18.365698 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c"}
Mar 10 09:14:18 crc kubenswrapper[4883]: I0310 09:14:18.365724 4883 scope.go:117] "RemoveContainer" containerID="dfccfd3d5d8560b851e16244b5ed951fe4fddcc169bfaf264619a37590b9592a"
Mar 10 09:14:34 crc kubenswrapper[4883]: I0310 09:14:34.336200 4883 scope.go:117] "RemoveContainer" containerID="8d2862eee27c865a5680228f73b67899d38c264111c25020319e7ec39c7a9c80"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.129182 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"]
Mar 10 09:15:00 crc kubenswrapper[4883]: E0310 09:15:00.130205 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" containerName="oc"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.130228 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" containerName="oc"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.130344 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" containerName="oc"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.130868 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.134118 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.134249 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.140210 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"]
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.236302 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.236358 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.236428 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn6jb\" (UniqueName: \"kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.337843 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.337926 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn6jb\" (UniqueName: \"kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.337994 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.338927 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.343791 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.353442 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn6jb\" (UniqueName: \"kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb\") pod \"collect-profiles-29552235-9wd67\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.446341 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:00 crc kubenswrapper[4883]: I0310 09:15:00.598546 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"]
Mar 10 09:15:01 crc kubenswrapper[4883]: I0310 09:15:01.563103 4883 generic.go:334] "Generic (PLEG): container finished" podID="e16ab2a6-c8ca-4487-b42f-381f61d18ba0" containerID="1cb9093a5dc1551f7fb85ef25abe36d1ab423453387c5dcc49644004e7492e56" exitCode=0
Mar 10 09:15:01 crc kubenswrapper[4883]: I0310 09:15:01.563215 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67" event={"ID":"e16ab2a6-c8ca-4487-b42f-381f61d18ba0","Type":"ContainerDied","Data":"1cb9093a5dc1551f7fb85ef25abe36d1ab423453387c5dcc49644004e7492e56"}
Mar 10 09:15:01 crc kubenswrapper[4883]: I0310 09:15:01.563526 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67" event={"ID":"e16ab2a6-c8ca-4487-b42f-381f61d18ba0","Type":"ContainerStarted","Data":"045f88345a3bdabd64b98dc407d9fd728b754e1365158cfc5d7267e108746abd"}
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.737395 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.868709 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume\") pod \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") "
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.868888 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume\") pod \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") "
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.868926 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn6jb\" (UniqueName: \"kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb\") pod \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\" (UID: \"e16ab2a6-c8ca-4487-b42f-381f61d18ba0\") "
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.869577 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume" (OuterVolumeSpecName: "config-volume") pod "e16ab2a6-c8ca-4487-b42f-381f61d18ba0" (UID: "e16ab2a6-c8ca-4487-b42f-381f61d18ba0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.874689 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb" (OuterVolumeSpecName: "kube-api-access-bn6jb") pod "e16ab2a6-c8ca-4487-b42f-381f61d18ba0" (UID: "e16ab2a6-c8ca-4487-b42f-381f61d18ba0"). InnerVolumeSpecName "kube-api-access-bn6jb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.874731 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e16ab2a6-c8ca-4487-b42f-381f61d18ba0" (UID: "e16ab2a6-c8ca-4487-b42f-381f61d18ba0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.970495 4883 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.970545 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn6jb\" (UniqueName: \"kubernetes.io/projected/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-kube-api-access-bn6jb\") on node \"crc\" DevicePath \"\""
Mar 10 09:15:02 crc kubenswrapper[4883]: I0310 09:15:02.970557 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e16ab2a6-c8ca-4487-b42f-381f61d18ba0-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 09:15:03 crc kubenswrapper[4883]: I0310 09:15:03.580382 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67" event={"ID":"e16ab2a6-c8ca-4487-b42f-381f61d18ba0","Type":"ContainerDied","Data":"045f88345a3bdabd64b98dc407d9fd728b754e1365158cfc5d7267e108746abd"}
Mar 10 09:15:03 crc kubenswrapper[4883]: I0310 09:15:03.580711 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="045f88345a3bdabd64b98dc407d9fd728b754e1365158cfc5d7267e108746abd"
Mar 10 09:15:03 crc kubenswrapper[4883]: I0310 09:15:03.580450 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.103635 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"]
Mar 10 09:15:53 crc kubenswrapper[4883]: E0310 09:15:53.104389 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16ab2a6-c8ca-4487-b42f-381f61d18ba0" containerName="collect-profiles"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.104404 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16ab2a6-c8ca-4487-b42f-381f61d18ba0" containerName="collect-profiles"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.104510 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16ab2a6-c8ca-4487-b42f-381f61d18ba0" containerName="collect-profiles"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.104860 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.106416 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.106639 4883 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jkcv6"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.106861 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.107460 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-kl2rd"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.108114 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kl2rd"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.111025 4883 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qw25n"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.117154 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dfhh4"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.117789 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.120070 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.120321 4883 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vstwv"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.130589 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-kl2rd"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.142964 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dfhh4"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.292609 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wrck\" (UniqueName: \"kubernetes.io/projected/1c0c9250-e9df-4898-bd0e-91919353a3f6-kube-api-access-8wrck\") pod \"cert-manager-858654f9db-kl2rd\" (UID: \"1c0c9250-e9df-4898-bd0e-91919353a3f6\") " pod="cert-manager/cert-manager-858654f9db-kl2rd"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.292758 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdtm7\" (UniqueName: \"kubernetes.io/projected/f33cf1b9-ce0d-41f4-8f36-1b159badc41e-kube-api-access-pdtm7\") pod \"cert-manager-webhook-687f57d79b-dfhh4\" (UID: \"f33cf1b9-ce0d-41f4-8f36-1b159badc41e\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.292913 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbpfv\" (UniqueName: \"kubernetes.io/projected/b92cb5d0-214a-49a6-b9b7-f210fef36956-kube-api-access-kbpfv\") pod \"cert-manager-cainjector-cf98fcc89-n2g9x\" (UID: \"b92cb5d0-214a-49a6-b9b7-f210fef36956\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.394226 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdtm7\" (UniqueName: \"kubernetes.io/projected/f33cf1b9-ce0d-41f4-8f36-1b159badc41e-kube-api-access-pdtm7\") pod \"cert-manager-webhook-687f57d79b-dfhh4\" (UID: \"f33cf1b9-ce0d-41f4-8f36-1b159badc41e\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.394301 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbpfv\" (UniqueName: \"kubernetes.io/projected/b92cb5d0-214a-49a6-b9b7-f210fef36956-kube-api-access-kbpfv\") pod \"cert-manager-cainjector-cf98fcc89-n2g9x\" (UID: \"b92cb5d0-214a-49a6-b9b7-f210fef36956\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.394370 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wrck\" (UniqueName: \"kubernetes.io/projected/1c0c9250-e9df-4898-bd0e-91919353a3f6-kube-api-access-8wrck\") pod \"cert-manager-858654f9db-kl2rd\" (UID: \"1c0c9250-e9df-4898-bd0e-91919353a3f6\") " pod="cert-manager/cert-manager-858654f9db-kl2rd"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.412521 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdtm7\" (UniqueName: \"kubernetes.io/projected/f33cf1b9-ce0d-41f4-8f36-1b159badc41e-kube-api-access-pdtm7\") pod \"cert-manager-webhook-687f57d79b-dfhh4\" (UID: \"f33cf1b9-ce0d-41f4-8f36-1b159badc41e\") " pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.413172 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wrck\" (UniqueName: \"kubernetes.io/projected/1c0c9250-e9df-4898-bd0e-91919353a3f6-kube-api-access-8wrck\") pod \"cert-manager-858654f9db-kl2rd\" (UID: \"1c0c9250-e9df-4898-bd0e-91919353a3f6\") " pod="cert-manager/cert-manager-858654f9db-kl2rd"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.413602 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbpfv\" (UniqueName: \"kubernetes.io/projected/b92cb5d0-214a-49a6-b9b7-f210fef36956-kube-api-access-kbpfv\") pod \"cert-manager-cainjector-cf98fcc89-n2g9x\" (UID: \"b92cb5d0-214a-49a6-b9b7-f210fef36956\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.431053 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.435836 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-kl2rd"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.452725 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.803655 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-kl2rd"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.828186 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-kl2rd" event={"ID":"1c0c9250-e9df-4898-bd0e-91919353a3f6","Type":"ContainerStarted","Data":"da4150833c09bc36a85adf0d73a58273a6d0cb80e6fb6f941d39fb274363c625"}
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.860272 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x"]
Mar 10 09:15:53 crc kubenswrapper[4883]: I0310 09:15:53.863235 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-dfhh4"]
Mar 10 09:15:53 crc kubenswrapper[4883]: W0310 09:15:53.870540 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92cb5d0_214a_49a6_b9b7_f210fef36956.slice/crio-0677f78f55af282e718229d75064158e8f521e4c0f5b72cc42c24b612ba336dc WatchSource:0}: Error finding container 0677f78f55af282e718229d75064158e8f521e4c0f5b72cc42c24b612ba336dc: Status 404 returned error can't find the container with id 0677f78f55af282e718229d75064158e8f521e4c0f5b72cc42c24b612ba336dc
Mar 10 09:15:53 crc kubenswrapper[4883]: W0310 09:15:53.877144 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf33cf1b9_ce0d_41f4_8f36_1b159badc41e.slice/crio-75f4248a44d760e9938375529d3567bc2fa003942302e375d40b588cc724aae6 WatchSource:0}: Error finding container 75f4248a44d760e9938375529d3567bc2fa003942302e375d40b588cc724aae6: Status 404 returned error can't find the container with id 75f4248a44d760e9938375529d3567bc2fa003942302e375d40b588cc724aae6
Mar 10 09:15:54 crc kubenswrapper[4883]: I0310 09:15:54.837097 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x" event={"ID":"b92cb5d0-214a-49a6-b9b7-f210fef36956","Type":"ContainerStarted","Data":"0677f78f55af282e718229d75064158e8f521e4c0f5b72cc42c24b612ba336dc"}
Mar 10 09:15:54 crc kubenswrapper[4883]: I0310 09:15:54.839152 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4" event={"ID":"f33cf1b9-ce0d-41f4-8f36-1b159badc41e","Type":"ContainerStarted","Data":"75f4248a44d760e9938375529d3567bc2fa003942302e375d40b588cc724aae6"}
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.857783 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4" event={"ID":"f33cf1b9-ce0d-41f4-8f36-1b159badc41e","Type":"ContainerStarted","Data":"769d2ee506b4abc0cac7ad309291e5fbe8da836ead0f52d1f5554cc66357937d"}
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.858653 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.861177 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x" event={"ID":"b92cb5d0-214a-49a6-b9b7-f210fef36956","Type":"ContainerStarted","Data":"8b993c6c5ec7a3d0bb2e90d797bbba40df33b5da3a486c51533ace44b82d27f2"}
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.863353 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-kl2rd" event={"ID":"1c0c9250-e9df-4898-bd0e-91919353a3f6","Type":"ContainerStarted","Data":"75027be737cadc92b4bd3b4acd45f4ab16f2ec61d887b30a54d742f4c639e91a"}
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.878466 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4" podStartSLOduration=1.9734419779999999 podStartE2EDuration="4.878435299s" podCreationTimestamp="2026-03-10 09:15:53 +0000 UTC" firstStartedPulling="2026-03-10 09:15:53.879309652 +0000 UTC m=+740.134207541" lastFinishedPulling="2026-03-10 09:15:56.784302963 +0000 UTC m=+743.039200862" observedRunningTime="2026-03-10 09:15:57.872489788 +0000 UTC m=+744.127387677" watchObservedRunningTime="2026-03-10 09:15:57.878435299 +0000 UTC m=+744.133333188"
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.889597 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-n2g9x" podStartSLOduration=2.00391127 podStartE2EDuration="4.889589111s" podCreationTimestamp="2026-03-10 09:15:53 +0000 UTC" firstStartedPulling="2026-03-10 09:15:53.879336362 +0000 UTC m=+740.134234251" lastFinishedPulling="2026-03-10 09:15:56.765014213 +0000 UTC m=+743.019912092" observedRunningTime="2026-03-10 09:15:57.888163192 +0000 UTC m=+744.143061080" watchObservedRunningTime="2026-03-10 09:15:57.889589111 +0000 UTC m=+744.144487000"
Mar 10 09:15:57 crc kubenswrapper[4883]: I0310 09:15:57.902657 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-kl2rd" podStartSLOduration=1.9541161790000001 podStartE2EDuration="4.902632065s" podCreationTimestamp="2026-03-10 09:15:53 +0000 UTC" firstStartedPulling="2026-03-10 09:15:53.810916541 +0000 UTC m=+740.065814430" lastFinishedPulling="2026-03-10 09:15:56.759432427 +0000 UTC m=+743.014330316" observedRunningTime="2026-03-10 09:15:57.90190361 +0000 UTC m=+744.156801500" watchObservedRunningTime="2026-03-10 09:15:57.902632065 +0000 UTC m=+744.157529954"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.154679 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552236-hdd6d"]
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.155890 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.158638 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.158639 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.168489 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.170766 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-hdd6d"]
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.278147 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxthb\" (UniqueName: \"kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb\") pod \"auto-csr-approver-29552236-hdd6d\" (UID: \"a2a502d2-d219-4f01-aebc-f27fb7766458\") " pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.379594 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxthb\" (UniqueName: \"kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb\") pod \"auto-csr-approver-29552236-hdd6d\" (UID: \"a2a502d2-d219-4f01-aebc-f27fb7766458\") " pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.401268 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxthb\" (UniqueName: \"kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb\") pod \"auto-csr-approver-29552236-hdd6d\" (UID: \"a2a502d2-d219-4f01-aebc-f27fb7766458\") " pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.471051 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.845133 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-hdd6d"]
Mar 10 09:16:00 crc kubenswrapper[4883]: W0310 09:16:00.850685 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2a502d2_d219_4f01_aebc_f27fb7766458.slice/crio-266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6 WatchSource:0}: Error finding container 266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6: Status 404 returned error can't find the container with id 266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6
Mar 10 09:16:00 crc kubenswrapper[4883]: I0310 09:16:00.878023 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552236-hdd6d" event={"ID":"a2a502d2-d219-4f01-aebc-f27fb7766458","Type":"ContainerStarted","Data":"266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6"}
Mar 10 09:16:02 crc kubenswrapper[4883]: I0310 09:16:02.891907 4883 generic.go:334] "Generic (PLEG): container finished" podID="a2a502d2-d219-4f01-aebc-f27fb7766458" containerID="3c695c439ea9e2ad9e771b20e1905dadd59374ebe052ec433b69ea1e82161c99" exitCode=0
Mar 10 09:16:02 crc kubenswrapper[4883]: I0310 09:16:02.892055 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552236-hdd6d" event={"ID":"a2a502d2-d219-4f01-aebc-f27fb7766458","Type":"ContainerDied","Data":"3c695c439ea9e2ad9e771b20e1905dadd59374ebe052ec433b69ea1e82161c99"}
Mar 10 09:16:03 crc kubenswrapper[4883]: I0310 09:16:03.455779 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-dfhh4"
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.099675 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.224946 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxthb\" (UniqueName: \"kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb\") pod \"a2a502d2-d219-4f01-aebc-f27fb7766458\" (UID: \"a2a502d2-d219-4f01-aebc-f27fb7766458\") "
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.230003 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb" (OuterVolumeSpecName: "kube-api-access-wxthb") pod "a2a502d2-d219-4f01-aebc-f27fb7766458" (UID: "a2a502d2-d219-4f01-aebc-f27fb7766458"). InnerVolumeSpecName "kube-api-access-wxthb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.325813 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxthb\" (UniqueName: \"kubernetes.io/projected/a2a502d2-d219-4f01-aebc-f27fb7766458-kube-api-access-wxthb\") on node \"crc\" DevicePath \"\""
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.904041 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552236-hdd6d" event={"ID":"a2a502d2-d219-4f01-aebc-f27fb7766458","Type":"ContainerDied","Data":"266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6"}
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.904398 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="266194f6cd77ef22f44bcb6d3a70c2b16552838159f0545d6f2a27f468893ce6"
Mar 10 09:16:04 crc kubenswrapper[4883]: I0310 09:16:04.904107 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-hdd6d"
Mar 10 09:16:05 crc kubenswrapper[4883]: I0310 09:16:05.142533 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-7n2zm"]
Mar 10 09:16:05 crc kubenswrapper[4883]: I0310 09:16:05.144275 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-7n2zm"]
Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.005720 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pzdml"]
Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006356 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-controller" containerID="cri-o://548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" gracePeriod=30
Mar 10 09:16:06 crc
kubenswrapper[4883]: I0310 09:16:06.006514 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="northd" containerID="cri-o://08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006416 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="nbdb" containerID="cri-o://d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006630 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-node" containerID="cri-o://836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006601 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-acl-logging" containerID="cri-o://39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006667 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="sbdb" containerID="cri-o://33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.006512 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" 
podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.042423 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" containerID="cri-o://afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" gracePeriod=30 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.085189 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e54057e-1436-403c-bd92-66dbb888b129" path="/var/lib/kubelet/pods/9e54057e-1436-403c-bd92-66dbb888b129/volumes" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.267865 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/3.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.270850 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovn-acl-logging/0.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.271511 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovn-controller/0.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.272034 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.321846 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xbrjj"] Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322154 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322173 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322182 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="nbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322190 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="nbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322198 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kubecfg-setup" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322204 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kubecfg-setup" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322212 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322218 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322230 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" 
containerName="northd" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322236 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="northd" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322242 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a502d2-d219-4f01-aebc-f27fb7766458" containerName="oc" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322247 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a502d2-d219-4f01-aebc-f27fb7766458" containerName="oc" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322258 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322263 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322271 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="sbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322276 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="sbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322285 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-acl-logging" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322290 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-acl-logging" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322299 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc 
kubenswrapper[4883]: I0310 09:16:06.322305 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322312 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322318 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322326 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-node" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322332 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-node" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322345 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322351 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322496 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322507 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322518 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" 
containerName="kube-rbac-proxy-node" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322528 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322538 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322546 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="northd" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322554 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322561 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322567 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2a502d2-d219-4f01-aebc-f27fb7766458" containerName="oc" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322574 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovn-acl-logging" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322582 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="sbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322590 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="nbdb" Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.322709 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322716 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.322835 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerName="ovnkube-controller" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.324547 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348348 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348401 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348461 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348502 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348524 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348622 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-kubelet\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348648 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovn-node-metrics-cert\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 
09:16:06.348665 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348687 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-log-socket\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348723 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-ovn\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348742 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-etc-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348761 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348787 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-script-lib\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348804 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-systemd-units\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348819 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-var-lib-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348836 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-env-overrides\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348860 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-node-log\") pod \"ovnkube-node-xbrjj\" (UID: 
\"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348877 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-systemd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348893 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-config\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348913 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-netd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348934 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8llmz\" (UniqueName: \"kubernetes.io/projected/a1a4989a-9fbe-41de-be68-4377681f9fd6-kube-api-access-8llmz\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348956 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-netns\") pod 
\"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348970 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-bin\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.348989 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.349013 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-slash\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.349050 4883 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.349606 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: 
"fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.355753 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.363587 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449769 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449806 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449837 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log\") pod 
\"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449889 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449897 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449912 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449936 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log" (OuterVolumeSpecName: "node-log") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449945 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h98t5\" (UniqueName: \"kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449961 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.449974 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450011 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450031 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450050 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450073 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450106 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450138 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450114 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450157 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450203 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450217 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket" (OuterVolumeSpecName: "log-socket") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450227 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes\") pod \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\" (UID: \"fc928c48-1df8-4c31-986e-eba2aa7a1c0b\") " Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450247 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450272 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450255 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash" (OuterVolumeSpecName: "host-slash") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450328 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450351 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450346 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450371 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450443 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-kubelet\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450508 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450535 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovn-node-metrics-cert\") pod 
\"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450551 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-kubelet\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450575 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-log-socket\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450619 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-log-socket\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450641 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-ovn\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450639 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 
09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450652 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450664 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-etc-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450688 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-etc-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450714 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450754 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450816 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-ovn\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450820 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450961 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.450979 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-script-lib\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451366 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-var-lib-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451447 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-var-lib-openvswitch\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451518 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-systemd-units\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451557 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-env-overrides\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 
09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451603 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-systemd-units\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451701 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-node-log\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451756 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-systemd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451789 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-config\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451809 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-node-log\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451860 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-script-lib\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451884 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-netd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451914 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-run-systemd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.451951 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-netd\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452025 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8llmz\" (UniqueName: \"kubernetes.io/projected/a1a4989a-9fbe-41de-be68-4377681f9fd6-kube-api-access-8llmz\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452071 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-netns\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452110 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-bin\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452144 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452229 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-env-overrides\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452258 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-slash\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452232 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-slash\") pod 
\"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452292 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-netns\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452329 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-cni-bin\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452449 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a4989a-9fbe-41de-be68-4377681f9fd6-host-run-ovn-kubernetes\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452514 4883 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452535 4883 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452549 4883 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452564 4883 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452576 4883 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-slash\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452587 4883 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-log-socket\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452599 4883 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452613 4883 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452625 4883 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452636 4883 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc 
kubenswrapper[4883]: I0310 09:16:06.452646 4883 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452646 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovnkube-config\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452657 4883 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452702 4883 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452716 4883 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452735 4883 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452751 4883 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 
crc kubenswrapper[4883]: I0310 09:16:06.452760 4883 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.452772 4883 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-node-log\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.454040 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1a4989a-9fbe-41de-be68-4377681f9fd6-ovn-node-metrics-cert\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.454051 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5" (OuterVolumeSpecName: "kube-api-access-h98t5") pod "fc928c48-1df8-4c31-986e-eba2aa7a1c0b" (UID: "fc928c48-1df8-4c31-986e-eba2aa7a1c0b"). InnerVolumeSpecName "kube-api-access-h98t5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.467459 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8llmz\" (UniqueName: \"kubernetes.io/projected/a1a4989a-9fbe-41de-be68-4377681f9fd6-kube-api-access-8llmz\") pod \"ovnkube-node-xbrjj\" (UID: \"a1a4989a-9fbe-41de-be68-4377681f9fd6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.554343 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h98t5\" (UniqueName: \"kubernetes.io/projected/fc928c48-1df8-4c31-986e-eba2aa7a1c0b-kube-api-access-h98t5\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.638033 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:06 crc kubenswrapper[4883]: W0310 09:16:06.659597 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a4989a_9fbe_41de_be68_4377681f9fd6.slice/crio-475407723f6183c9255a0a3acf38f084fe69309ffac7a14a4e73d2dfd3c5ac0c WatchSource:0}: Error finding container 475407723f6183c9255a0a3acf38f084fe69309ffac7a14a4e73d2dfd3c5ac0c: Status 404 returned error can't find the container with id 475407723f6183c9255a0a3acf38f084fe69309ffac7a14a4e73d2dfd3c5ac0c Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.918268 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovnkube-controller/3.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.921196 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovn-acl-logging/0.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.921763 4883 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pzdml_fc928c48-1df8-4c31-986e-eba2aa7a1c0b/ovn-controller/0.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922216 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922246 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922254 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922263 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922271 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922277 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922287 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" exitCode=143 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 
09:16:06.922295 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" exitCode=143 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922308 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922337 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922358 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922375 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922387 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922403 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 
09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922418 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922431 4883 scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922431 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922555 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922564 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922571 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922578 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922584 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:16:06 
crc kubenswrapper[4883]: I0310 09:16:06.922591 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922598 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922604 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922614 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922626 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922633 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922639 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922644 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922650 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922656 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922663 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922670 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922677 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922683 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922692 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922703 4883 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922709 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922716 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922723 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922731 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922737 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922743 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922748 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922754 4883 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922758 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922766 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pzdml" event={"ID":"fc928c48-1df8-4c31-986e-eba2aa7a1c0b","Type":"ContainerDied","Data":"f3b20c822aee97242cfd38a934b97331876c32d69d19440940706ed8c35c887b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922775 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922781 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922786 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922791 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922796 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} Mar 10 
09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922802 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922807 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922814 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922819 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.922825 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.925797 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/2.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.926297 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/1.log" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.926330 4883 generic.go:334] "Generic (PLEG): container finished" podID="8e883c29-520e-4b1f-b49c-3df10450d467" containerID="5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d" exitCode=2 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.926399 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerDied","Data":"5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.926441 4883 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.927429 4883 scope.go:117] "RemoveContainer" containerID="5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.928825 4883 generic.go:334] "Generic (PLEG): container finished" podID="a1a4989a-9fbe-41de-be68-4377681f9fd6" containerID="e8f9377905675bb42c0dd73b121c12a4cf939452f080a77a60656a017a2c06e0" exitCode=0 Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.928882 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerDied","Data":"e8f9377905675bb42c0dd73b121c12a4cf939452f080a77a60656a017a2c06e0"} Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.928937 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"475407723f6183c9255a0a3acf38f084fe69309ffac7a14a4e73d2dfd3c5ac0c"} Mar 10 09:16:06 crc kubenswrapper[4883]: E0310 09:16:06.929427 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467)\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:16:06 crc 
kubenswrapper[4883]: I0310 09:16:06.951575 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.958021 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pzdml"] Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.960697 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pzdml"] Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.968238 4883 scope.go:117] "RemoveContainer" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.980807 4883 scope.go:117] "RemoveContainer" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:06 crc kubenswrapper[4883]: I0310 09:16:06.992440 4883 scope.go:117] "RemoveContainer" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.015721 4883 scope.go:117] "RemoveContainer" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.030471 4883 scope.go:117] "RemoveContainer" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.044101 4883 scope.go:117] "RemoveContainer" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.058967 4883 scope.go:117] "RemoveContainer" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.078546 4883 scope.go:117] "RemoveContainer" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.092555 4883 
scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.092839 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.092880 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} err="failed to get container status \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.092919 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.093249 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": container with ID starting with 156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a not found: ID does not exist" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.093279 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} err="failed to get container status \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": rpc error: code = NotFound desc = could not find container \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": container with ID starting with 156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.093303 4883 scope.go:117] "RemoveContainer" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.093647 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": container with ID starting with 33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651 not found: ID does not exist" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.093673 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} err="failed to get container status \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": rpc error: code = NotFound desc = could not find container \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": container with ID starting with 33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.093688 4883 scope.go:117] "RemoveContainer" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.094008 4883 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": container with ID starting with d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b not found: ID does not exist" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094030 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} err="failed to get container status \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": rpc error: code = NotFound desc = could not find container \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": container with ID starting with d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094043 4883 scope.go:117] "RemoveContainer" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.094377 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": container with ID starting with 08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7 not found: ID does not exist" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094402 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} err="failed to get container status \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": rpc error: code = NotFound desc = could not find container 
\"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": container with ID starting with 08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094420 4883 scope.go:117] "RemoveContainer" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.094734 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": container with ID starting with 7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5 not found: ID does not exist" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094755 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} err="failed to get container status \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": rpc error: code = NotFound desc = could not find container \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": container with ID starting with 7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.094809 4883 scope.go:117] "RemoveContainer" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.095061 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": container with ID starting with 836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b not found: ID does not exist" 
containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095082 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} err="failed to get container status \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": rpc error: code = NotFound desc = could not find container \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": container with ID starting with 836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095104 4883 scope.go:117] "RemoveContainer" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.095374 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": container with ID starting with 39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef not found: ID does not exist" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095400 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} err="failed to get container status \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": rpc error: code = NotFound desc = could not find container \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": container with ID starting with 39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095415 4883 scope.go:117] 
"RemoveContainer" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.095703 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": container with ID starting with 548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531 not found: ID does not exist" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095745 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} err="failed to get container status \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": rpc error: code = NotFound desc = could not find container \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": container with ID starting with 548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.095759 4883 scope.go:117] "RemoveContainer" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: E0310 09:16:07.096090 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": container with ID starting with 131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca not found: ID does not exist" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096117 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} err="failed to get container status \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": rpc error: code = NotFound desc = could not find container \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": container with ID starting with 131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096134 4883 scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096396 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} err="failed to get container status \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096419 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096674 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} err="failed to get container status \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": rpc error: code = NotFound desc = could not find container \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": container with ID starting with 156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a not found: ID does not 
exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096692 4883 scope.go:117] "RemoveContainer" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096947 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} err="failed to get container status \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": rpc error: code = NotFound desc = could not find container \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": container with ID starting with 33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.096970 4883 scope.go:117] "RemoveContainer" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097225 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} err="failed to get container status \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": rpc error: code = NotFound desc = could not find container \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": container with ID starting with d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097242 4883 scope.go:117] "RemoveContainer" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097461 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} err="failed to get container status 
\"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": rpc error: code = NotFound desc = could not find container \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": container with ID starting with 08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097493 4883 scope.go:117] "RemoveContainer" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097740 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} err="failed to get container status \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": rpc error: code = NotFound desc = could not find container \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": container with ID starting with 7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097758 4883 scope.go:117] "RemoveContainer" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097971 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} err="failed to get container status \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": rpc error: code = NotFound desc = could not find container \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": container with ID starting with 836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.097987 4883 scope.go:117] "RemoveContainer" 
containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098235 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} err="failed to get container status \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": rpc error: code = NotFound desc = could not find container \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": container with ID starting with 39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098254 4883 scope.go:117] "RemoveContainer" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098498 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} err="failed to get container status \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": rpc error: code = NotFound desc = could not find container \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": container with ID starting with 548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098519 4883 scope.go:117] "RemoveContainer" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098746 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} err="failed to get container status \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": rpc error: code = NotFound desc = could 
not find container \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": container with ID starting with 131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098763 4883 scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.098997 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} err="failed to get container status \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.099022 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.099301 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} err="failed to get container status \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": rpc error: code = NotFound desc = could not find container \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": container with ID starting with 156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.099333 4883 scope.go:117] "RemoveContainer" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 
09:16:07.099589 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} err="failed to get container status \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": rpc error: code = NotFound desc = could not find container \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": container with ID starting with 33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.099608 4883 scope.go:117] "RemoveContainer" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100051 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} err="failed to get container status \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": rpc error: code = NotFound desc = could not find container \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": container with ID starting with d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100072 4883 scope.go:117] "RemoveContainer" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100352 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} err="failed to get container status \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": rpc error: code = NotFound desc = could not find container \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": container with ID starting with 
08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100371 4883 scope.go:117] "RemoveContainer" containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100663 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} err="failed to get container status \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": rpc error: code = NotFound desc = could not find container \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": container with ID starting with 7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100685 4883 scope.go:117] "RemoveContainer" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100940 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} err="failed to get container status \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": rpc error: code = NotFound desc = could not find container \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": container with ID starting with 836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.100958 4883 scope.go:117] "RemoveContainer" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101229 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} err="failed to get container status \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": rpc error: code = NotFound desc = could not find container \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": container with ID starting with 39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101247 4883 scope.go:117] "RemoveContainer" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101439 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} err="failed to get container status \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": rpc error: code = NotFound desc = could not find container \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": container with ID starting with 548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101456 4883 scope.go:117] "RemoveContainer" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101681 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} err="failed to get container status \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": rpc error: code = NotFound desc = could not find container \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": container with ID starting with 131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca not found: ID does not 
exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101703 4883 scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101975 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} err="failed to get container status \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.101997 4883 scope.go:117] "RemoveContainer" containerID="156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102220 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a"} err="failed to get container status \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": rpc error: code = NotFound desc = could not find container \"156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a\": container with ID starting with 156ee9f23ce9b3a351545a380acf001c04389edd500c4713ef2471a8a4850a7a not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102239 4883 scope.go:117] "RemoveContainer" containerID="33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102516 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651"} err="failed to get container status 
\"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": rpc error: code = NotFound desc = could not find container \"33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651\": container with ID starting with 33b46753da242badc74c631e424bbbffa8654e7d6dd8ff0afbecec03f782b651 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102540 4883 scope.go:117] "RemoveContainer" containerID="d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102739 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b"} err="failed to get container status \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": rpc error: code = NotFound desc = could not find container \"d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b\": container with ID starting with d426265ce8dc7d7891da101dbb294bf47d963f060a0dd9c0bc57a692e75fd88b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102760 4883 scope.go:117] "RemoveContainer" containerID="08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.102996 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7"} err="failed to get container status \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": rpc error: code = NotFound desc = could not find container \"08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7\": container with ID starting with 08dbe9d4484d0533f184c7a6e3d6488b541e81be4f8d20ad656071976e66d3a7 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103013 4883 scope.go:117] "RemoveContainer" 
containerID="7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103246 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5"} err="failed to get container status \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": rpc error: code = NotFound desc = could not find container \"7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5\": container with ID starting with 7ec3d1f5457c523a49f751fbc33382db23710bda4f34d1ad22a13792a68561e5 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103266 4883 scope.go:117] "RemoveContainer" containerID="836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103633 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b"} err="failed to get container status \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": rpc error: code = NotFound desc = could not find container \"836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b\": container with ID starting with 836624983b1625963fe8dd9c243934237a1a41e2da32cad6db2396512880d55b not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103659 4883 scope.go:117] "RemoveContainer" containerID="39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103962 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef"} err="failed to get container status \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": rpc error: code = NotFound desc = could 
not find container \"39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef\": container with ID starting with 39e232cfe43d01fb72a140919e11e281f9be061493359321fb23f43b0d7614ef not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.103981 4883 scope.go:117] "RemoveContainer" containerID="548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.104270 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531"} err="failed to get container status \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": rpc error: code = NotFound desc = could not find container \"548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531\": container with ID starting with 548af89bdaf8399b9dadb8234810c550e5e4642ffee1f165861a354c1582e531 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.104291 4883 scope.go:117] "RemoveContainer" containerID="131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.104514 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca"} err="failed to get container status \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": rpc error: code = NotFound desc = could not find container \"131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca\": container with ID starting with 131f21e771a90787fa0c65e82d3b9b0ce2b45a13d5530c46f7edff256bec46ca not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.104535 4883 scope.go:117] "RemoveContainer" containerID="afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 
09:16:07.104755 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961"} err="failed to get container status \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": rpc error: code = NotFound desc = could not find container \"afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961\": container with ID starting with afb85b40eacaa1679759c0dc60c7b99fcb3266f8403b813755bae2eb28109961 not found: ID does not exist" Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.939621 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"ef18237781cf3eb2b10d931e7aee279eb2a8cc0a999469990928f6bbe7f4b0c2"} Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.939993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"fdc4eb9ef955c2643f68207ef59ea1567ab6b0f96683426e45ffff4c4d7fcdf7"} Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.940007 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"674e6c1129630fdafe78cd8f2534e7135163fdd79f70b5f0c022fd4ac6f7071a"} Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.940017 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"74a755b6f42093cc0389ec104591dbfc9802ef4f52b3e24519cde248bce23bb0"} Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.940027 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" 
event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"e3f622dc1c73a5d7d1ba16b7dc9a8be7459f672e58cd877c2d07bb363180f22e"} Mar 10 09:16:07 crc kubenswrapper[4883]: I0310 09:16:07.940035 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"d597f39ccfeea416934eb2235d29690bb961304ca221b81975823d2f3d155b6a"} Mar 10 09:16:08 crc kubenswrapper[4883]: I0310 09:16:08.089004 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc928c48-1df8-4c31-986e-eba2aa7a1c0b" path="/var/lib/kubelet/pods/fc928c48-1df8-4c31-986e-eba2aa7a1c0b/volumes" Mar 10 09:16:09 crc kubenswrapper[4883]: I0310 09:16:09.956048 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"46c4372107ba29a84a898ab5768567eb2230ed222c911c3e103d84a58fbb5627"} Mar 10 09:16:11 crc kubenswrapper[4883]: I0310 09:16:11.975189 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" event={"ID":"a1a4989a-9fbe-41de-be68-4377681f9fd6","Type":"ContainerStarted","Data":"b1d402380d7cad480196967114f6270018a83ac6af0e3ec4056ea5c3461b3c75"} Mar 10 09:16:11 crc kubenswrapper[4883]: I0310 09:16:11.975752 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:11 crc kubenswrapper[4883]: I0310 09:16:11.975785 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:11 crc kubenswrapper[4883]: I0310 09:16:11.975808 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:12 crc kubenswrapper[4883]: I0310 09:16:12.005402 4883 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:12 crc kubenswrapper[4883]: I0310 09:16:12.010193 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:12 crc kubenswrapper[4883]: I0310 09:16:12.016894 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" podStartSLOduration=6.016879561 podStartE2EDuration="6.016879561s" podCreationTimestamp="2026-03-10 09:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:16:12.011968531 +0000 UTC m=+758.266866420" watchObservedRunningTime="2026-03-10 09:16:12.016879561 +0000 UTC m=+758.271777450" Mar 10 09:16:17 crc kubenswrapper[4883]: I0310 09:16:17.449268 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:16:17 crc kubenswrapper[4883]: I0310 09:16:17.450075 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:16:20 crc kubenswrapper[4883]: I0310 09:16:20.079674 4883 scope.go:117] "RemoveContainer" containerID="5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d" Mar 10 09:16:20 crc kubenswrapper[4883]: E0310 09:16:20.080276 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s 
restarting failed container=kube-multus pod=multus-p898z_openshift-multus(8e883c29-520e-4b1f-b49c-3df10450d467)\"" pod="openshift-multus/multus-p898z" podUID="8e883c29-520e-4b1f-b49c-3df10450d467" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.264668 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd"] Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.266895 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.269040 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.276976 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd"] Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.366893 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccdjz\" (UniqueName: \"kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.366981 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.367012 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.468435 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccdjz\" (UniqueName: \"kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.468540 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.468567 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: 
I0310 09:16:33.469059 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.469106 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.486466 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccdjz\" (UniqueName: \"kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: I0310 09:16:33.586111 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: E0310 09:16:33.619045 4883 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(19b0cf1042716c8df6c003917583506f8b407b2f7eca648890497c4cd0bd90fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:16:33 crc kubenswrapper[4883]: E0310 09:16:33.619146 4883 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(19b0cf1042716c8df6c003917583506f8b407b2f7eca648890497c4cd0bd90fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: E0310 09:16:33.619179 4883 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(19b0cf1042716c8df6c003917583506f8b407b2f7eca648890497c4cd0bd90fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:33 crc kubenswrapper[4883]: E0310 09:16:33.619253 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace(eb61f8a4-ceed-4f2a-91fe-ead52fb416ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace(eb61f8a4-ceed-4f2a-91fe-ead52fb416ee)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(19b0cf1042716c8df6c003917583506f8b407b2f7eca648890497c4cd0bd90fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" Mar 10 09:16:34 crc kubenswrapper[4883]: I0310 09:16:34.082697 4883 scope.go:117] "RemoveContainer" containerID="5ff25fea28b07754fa577f6bb631a6166978ad82f449fba3657e3971a580bd6d" Mar 10 09:16:34 crc kubenswrapper[4883]: I0310 09:16:34.092963 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:34 crc kubenswrapper[4883]: I0310 09:16:34.093301 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:34 crc kubenswrapper[4883]: E0310 09:16:34.122196 4883 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(d7c0f66777e5209e670ab0846956bba98b9fc5ab89cac08b1f0c3a0fa2ac1126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 09:16:34 crc kubenswrapper[4883]: E0310 09:16:34.122265 4883 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(d7c0f66777e5209e670ab0846956bba98b9fc5ab89cac08b1f0c3a0fa2ac1126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:34 crc kubenswrapper[4883]: E0310 09:16:34.122296 4883 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(d7c0f66777e5209e670ab0846956bba98b9fc5ab89cac08b1f0c3a0fa2ac1126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:34 crc kubenswrapper[4883]: E0310 09:16:34.122352 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace(eb61f8a4-ceed-4f2a-91fe-ead52fb416ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace(eb61f8a4-ceed-4f2a-91fe-ead52fb416ee)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_openshift-marketplace_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee_0(d7c0f66777e5209e670ab0846956bba98b9fc5ab89cac08b1f0c3a0fa2ac1126): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" Mar 10 09:16:34 crc kubenswrapper[4883]: I0310 09:16:34.380335 4883 scope.go:117] "RemoveContainer" containerID="7590615cba141d2df532a5f2b91dc13b678e9424c198e88d350b709d2d0d8639" Mar 10 09:16:34 crc kubenswrapper[4883]: I0310 09:16:34.408003 4883 scope.go:117] "RemoveContainer" containerID="498e50de029e36d32b6d4b227b801583bc911f15ed93811e369c23b7504e07a0" Mar 10 09:16:35 crc kubenswrapper[4883]: I0310 09:16:35.103160 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p898z_8e883c29-520e-4b1f-b49c-3df10450d467/kube-multus/2.log" Mar 10 09:16:35 crc kubenswrapper[4883]: I0310 09:16:35.103244 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p898z" event={"ID":"8e883c29-520e-4b1f-b49c-3df10450d467","Type":"ContainerStarted","Data":"49260ae22d078e2736178f52f24f2210a2279bcdd98e9c39923978aa4fc77ff2"} Mar 10 09:16:36 crc kubenswrapper[4883]: I0310 
09:16:36.660962 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xbrjj" Mar 10 09:16:47 crc kubenswrapper[4883]: I0310 09:16:47.448834 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:16:47 crc kubenswrapper[4883]: I0310 09:16:47.449620 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:16:48 crc kubenswrapper[4883]: I0310 09:16:48.079285 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:48 crc kubenswrapper[4883]: I0310 09:16:48.079858 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:48 crc kubenswrapper[4883]: I0310 09:16:48.246036 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd"] Mar 10 09:16:49 crc kubenswrapper[4883]: I0310 09:16:49.184249 4883 generic.go:334] "Generic (PLEG): container finished" podID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerID="36e8ac20e381043c167658371a35a780725504424d32e501e74c3d910453459d" exitCode=0 Mar 10 09:16:49 crc kubenswrapper[4883]: I0310 09:16:49.184345 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" event={"ID":"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee","Type":"ContainerDied","Data":"36e8ac20e381043c167658371a35a780725504424d32e501e74c3d910453459d"} Mar 10 09:16:49 crc kubenswrapper[4883]: I0310 09:16:49.184707 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" event={"ID":"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee","Type":"ContainerStarted","Data":"5255d6c98df782f934196c3e1e921929f7c3086234bcf25dd75dd7cf62a47038"} Mar 10 09:16:51 crc kubenswrapper[4883]: I0310 09:16:51.198869 4883 generic.go:334] "Generic (PLEG): container finished" podID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerID="9e7e07ed48b3e382b8d19e5c933331df54943b9bd98226dea20b61c50c91eb15" exitCode=0 Mar 10 09:16:51 crc kubenswrapper[4883]: I0310 09:16:51.198924 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" event={"ID":"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee","Type":"ContainerDied","Data":"9e7e07ed48b3e382b8d19e5c933331df54943b9bd98226dea20b61c50c91eb15"} Mar 10 09:16:52 crc kubenswrapper[4883]: I0310 09:16:52.206913 4883 
generic.go:334] "Generic (PLEG): container finished" podID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerID="0f2feaf6b098d019f4e99f5790974e9d1121e3e582a8163ac9049bf43eeb604b" exitCode=0 Mar 10 09:16:52 crc kubenswrapper[4883]: I0310 09:16:52.206997 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" event={"ID":"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee","Type":"ContainerDied","Data":"0f2feaf6b098d019f4e99f5790974e9d1121e3e582a8163ac9049bf43eeb604b"} Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.412871 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.585786 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util\") pod \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.585836 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccdjz\" (UniqueName: \"kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz\") pod \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.585958 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle\") pod \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\" (UID: \"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee\") " Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.588192 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle" (OuterVolumeSpecName: "bundle") pod "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" (UID: "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.601316 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz" (OuterVolumeSpecName: "kube-api-access-ccdjz") pod "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" (UID: "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee"). InnerVolumeSpecName "kube-api-access-ccdjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.606816 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util" (OuterVolumeSpecName: "util") pod "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" (UID: "eb61f8a4-ceed-4f2a-91fe-ead52fb416ee"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.687917 4883 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-util\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.688242 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccdjz\" (UniqueName: \"kubernetes.io/projected/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-kube-api-access-ccdjz\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:53 crc kubenswrapper[4883]: I0310 09:16:53.688339 4883 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb61f8a4-ceed-4f2a-91fe-ead52fb416ee-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:54 crc kubenswrapper[4883]: I0310 09:16:54.220872 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" event={"ID":"eb61f8a4-ceed-4f2a-91fe-ead52fb416ee","Type":"ContainerDied","Data":"5255d6c98df782f934196c3e1e921929f7c3086234bcf25dd75dd7cf62a47038"} Mar 10 09:16:54 crc kubenswrapper[4883]: I0310 09:16:54.220925 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5255d6c98df782f934196c3e1e921929f7c3086234bcf25dd75dd7cf62a47038" Mar 10 09:16:54 crc kubenswrapper[4883]: I0310 09:16:54.220947 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.734550 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s"] Mar 10 09:16:59 crc kubenswrapper[4883]: E0310 09:16:59.735994 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="pull" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.736099 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="pull" Mar 10 09:16:59 crc kubenswrapper[4883]: E0310 09:16:59.736168 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="util" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.736232 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="util" Mar 10 09:16:59 crc kubenswrapper[4883]: E0310 09:16:59.736289 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="extract" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.736347 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="extract" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.736534 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb61f8a4-ceed-4f2a-91fe-ead52fb416ee" containerName="extract" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.737048 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.738797 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.739592 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.739726 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7xfw4" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.750124 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s"] Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.755865 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq2f2\" (UniqueName: \"kubernetes.io/projected/a776287a-5b99-4f43-8d4c-191108392859-kube-api-access-lq2f2\") pod \"nmstate-operator-75c5dccd6c-k4v4s\" (UID: \"a776287a-5b99-4f43-8d4c-191108392859\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.857048 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq2f2\" (UniqueName: \"kubernetes.io/projected/a776287a-5b99-4f43-8d4c-191108392859-kube-api-access-lq2f2\") pod \"nmstate-operator-75c5dccd6c-k4v4s\" (UID: \"a776287a-5b99-4f43-8d4c-191108392859\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" Mar 10 09:16:59 crc kubenswrapper[4883]: I0310 09:16:59.875174 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq2f2\" (UniqueName: \"kubernetes.io/projected/a776287a-5b99-4f43-8d4c-191108392859-kube-api-access-lq2f2\") pod \"nmstate-operator-75c5dccd6c-k4v4s\" (UID: 
\"a776287a-5b99-4f43-8d4c-191108392859\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" Mar 10 09:17:00 crc kubenswrapper[4883]: I0310 09:17:00.051032 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" Mar 10 09:17:00 crc kubenswrapper[4883]: I0310 09:17:00.200417 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s"] Mar 10 09:17:00 crc kubenswrapper[4883]: I0310 09:17:00.261110 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" event={"ID":"a776287a-5b99-4f43-8d4c-191108392859","Type":"ContainerStarted","Data":"a6c962e1c0b3a4aa6c68d195bc11275fd828c7232e1e4003e6b3c87f5bce4a71"} Mar 10 09:17:03 crc kubenswrapper[4883]: I0310 09:17:03.284286 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" event={"ID":"a776287a-5b99-4f43-8d4c-191108392859","Type":"ContainerStarted","Data":"9c5ca8e5aa4cec992567a344c1c28ba7a506efdcd66863b5dfb24607c6bb561d"} Mar 10 09:17:03 crc kubenswrapper[4883]: I0310 09:17:03.302040 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-k4v4s" podStartSLOduration=1.8214824219999999 podStartE2EDuration="4.30201833s" podCreationTimestamp="2026-03-10 09:16:59 +0000 UTC" firstStartedPulling="2026-03-10 09:17:00.208088542 +0000 UTC m=+806.462986430" lastFinishedPulling="2026-03-10 09:17:02.688624449 +0000 UTC m=+808.943522338" observedRunningTime="2026-03-10 09:17:03.301147718 +0000 UTC m=+809.556045607" watchObservedRunningTime="2026-03-10 09:17:03.30201833 +0000 UTC m=+809.556916219" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.152438 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-x5lcq"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 
09:17:04.153306 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.157787 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ccbds"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.158597 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.159940 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bnhg7" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.160102 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.166650 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-x5lcq"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.171665 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ccbds"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.195165 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-5lcxd"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.195815 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.257599 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.258560 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.263752 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.263985 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.265045 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-5d6x9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.268421 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316248 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpns9\" (UniqueName: \"kubernetes.io/projected/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-kube-api-access-fpns9\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316311 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-ovs-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316373 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpckr\" (UniqueName: \"kubernetes.io/projected/291985dd-d623-46ba-9e1b-056dc17d26ed-kube-api-access-zpckr\") pod \"nmstate-metrics-69594cc75-x5lcq\" (UID: \"291985dd-d623-46ba-9e1b-056dc17d26ed\") " 
pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316398 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316439 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-nmstate-lock\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316492 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-dbus-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.316514 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf67s\" (UniqueName: \"kubernetes.io/projected/10ab1e00-47a1-4f9a-a55a-131935759d8d-kube-api-access-zf67s\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418066 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpckr\" (UniqueName: \"kubernetes.io/projected/291985dd-d623-46ba-9e1b-056dc17d26ed-kube-api-access-zpckr\") pod 
\"nmstate-metrics-69594cc75-x5lcq\" (UID: \"291985dd-d623-46ba-9e1b-056dc17d26ed\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418110 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418154 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sgnd\" (UniqueName: \"kubernetes.io/projected/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-kube-api-access-9sgnd\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418182 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418207 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-nmstate-lock\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418250 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-dbus-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418266 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf67s\" (UniqueName: \"kubernetes.io/projected/10ab1e00-47a1-4f9a-a55a-131935759d8d-kube-api-access-zf67s\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: E0310 09:17:04.418290 4883 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418343 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-nmstate-lock\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: E0310 09:17:04.418375 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair podName:10ab1e00-47a1-4f9a-a55a-131935759d8d nodeName:}" failed. No retries permitted until 2026-03-10 09:17:04.918354483 +0000 UTC m=+811.173252373 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair") pod "nmstate-webhook-786f45cff4-ccbds" (UID: "10ab1e00-47a1-4f9a-a55a-131935759d8d") : secret "openshift-nmstate-webhook" not found Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418298 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpns9\" (UniqueName: \"kubernetes.io/projected/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-kube-api-access-fpns9\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418583 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-ovs-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418597 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-dbus-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418629 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.418669 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" 
(UniqueName: \"kubernetes.io/host-path/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-ovs-socket\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.437806 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf67s\" (UniqueName: \"kubernetes.io/projected/10ab1e00-47a1-4f9a-a55a-131935759d8d-kube-api-access-zf67s\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.442161 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpckr\" (UniqueName: \"kubernetes.io/projected/291985dd-d623-46ba-9e1b-056dc17d26ed-kube-api-access-zpckr\") pod \"nmstate-metrics-69594cc75-x5lcq\" (UID: \"291985dd-d623-46ba-9e1b-056dc17d26ed\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.442618 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpns9\" (UniqueName: \"kubernetes.io/projected/d9c7e9ee-a0a0-4afe-bd00-872553ca9b32-kube-api-access-fpns9\") pod \"nmstate-handler-5lcxd\" (UID: \"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32\") " pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.450011 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f5c87b79f-276j9"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.450856 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.465660 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f5c87b79f-276j9"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.467801 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.506103 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519537 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4nbq\" (UniqueName: \"kubernetes.io/projected/0a27925b-cdd8-4de0-9550-c885b528b9e4-kube-api-access-q4nbq\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519578 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-oauth-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519600 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519654 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-trusted-ca-bundle\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519688 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519721 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-oauth-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519739 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519877 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sgnd\" (UniqueName: \"kubernetes.io/projected/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-kube-api-access-9sgnd\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 
09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519910 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.519927 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-service-ca\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.521715 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.525514 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.534418 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sgnd\" (UniqueName: \"kubernetes.io/projected/805fc4e3-bab7-415e-a190-0ceeda5bd8b7-kube-api-access-9sgnd\") pod \"nmstate-console-plugin-5dcbbd79cf-mr8tf\" (UID: \"805fc4e3-bab7-415e-a190-0ceeda5bd8b7\") 
" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.575762 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620440 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-oauth-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620503 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620547 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-trusted-ca-bundle\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620586 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-oauth-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620606 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620651 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-service-ca\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.620672 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4nbq\" (UniqueName: \"kubernetes.io/projected/0a27925b-cdd8-4de0-9550-c885b528b9e4-kube-api-access-q4nbq\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.621379 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-oauth-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.622005 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-service-ca\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.622412 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-trusted-ca-bundle\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.622438 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.624910 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-serving-cert\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.625564 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a27925b-cdd8-4de0-9550-c885b528b9e4-console-oauth-config\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.637725 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4nbq\" (UniqueName: \"kubernetes.io/projected/0a27925b-cdd8-4de0-9550-c885b528b9e4-kube-api-access-q4nbq\") pod \"console-5f5c87b79f-276j9\" (UID: \"0a27925b-cdd8-4de0-9550-c885b528b9e4\") " pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.726551 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf"] Mar 10 09:17:04 crc kubenswrapper[4883]: 
W0310 09:17:04.731551 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod805fc4e3_bab7_415e_a190_0ceeda5bd8b7.slice/crio-3c63283247090812585319257a6867f37a94f47618e410c0c1307e7ab2b11f23 WatchSource:0}: Error finding container 3c63283247090812585319257a6867f37a94f47618e410c0c1307e7ab2b11f23: Status 404 returned error can't find the container with id 3c63283247090812585319257a6867f37a94f47618e410c0c1307e7ab2b11f23 Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.810025 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:04 crc kubenswrapper[4883]: W0310 09:17:04.869176 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod291985dd_d623_46ba_9e1b_056dc17d26ed.slice/crio-a8d3f2702fd40b7e1efb17bf3b72742b5d45250ba1008c7df6e0a0dcf84f60e6 WatchSource:0}: Error finding container a8d3f2702fd40b7e1efb17bf3b72742b5d45250ba1008c7df6e0a0dcf84f60e6: Status 404 returned error can't find the container with id a8d3f2702fd40b7e1efb17bf3b72742b5d45250ba1008c7df6e0a0dcf84f60e6 Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.871072 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-x5lcq"] Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.923963 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:04 crc kubenswrapper[4883]: I0310 09:17:04.927469 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/10ab1e00-47a1-4f9a-a55a-131935759d8d-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-ccbds\" (UID: \"10ab1e00-47a1-4f9a-a55a-131935759d8d\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.085459 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.175923 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f5c87b79f-276j9"] Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.248920 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-ccbds"] Mar 10 09:17:05 crc kubenswrapper[4883]: W0310 09:17:05.265675 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10ab1e00_47a1_4f9a_a55a_131935759d8d.slice/crio-ca526d8f45e90da3948c284971dc860238dae2b5339d0df09c43df7e1bb14502 WatchSource:0}: Error finding container ca526d8f45e90da3948c284971dc860238dae2b5339d0df09c43df7e1bb14502: Status 404 returned error can't find the container with id ca526d8f45e90da3948c284971dc860238dae2b5339d0df09c43df7e1bb14502 Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.296870 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" event={"ID":"291985dd-d623-46ba-9e1b-056dc17d26ed","Type":"ContainerStarted","Data":"a8d3f2702fd40b7e1efb17bf3b72742b5d45250ba1008c7df6e0a0dcf84f60e6"} Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.298225 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5lcxd" event={"ID":"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32","Type":"ContainerStarted","Data":"610e06fd61703a658e69b2efd6e988a3b049126b6a321ba1f4969be1737eed09"} Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 
09:17:05.299457 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" event={"ID":"10ab1e00-47a1-4f9a-a55a-131935759d8d","Type":"ContainerStarted","Data":"ca526d8f45e90da3948c284971dc860238dae2b5339d0df09c43df7e1bb14502"} Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.302124 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f5c87b79f-276j9" event={"ID":"0a27925b-cdd8-4de0-9550-c885b528b9e4","Type":"ContainerStarted","Data":"2794f47035450accbeeb6b159c10062d75e8c0dc020a9dc7c6db25644ec8a7f4"} Mar 10 09:17:05 crc kubenswrapper[4883]: I0310 09:17:05.303060 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" event={"ID":"805fc4e3-bab7-415e-a190-0ceeda5bd8b7","Type":"ContainerStarted","Data":"3c63283247090812585319257a6867f37a94f47618e410c0c1307e7ab2b11f23"} Mar 10 09:17:06 crc kubenswrapper[4883]: I0310 09:17:06.312667 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f5c87b79f-276j9" event={"ID":"0a27925b-cdd8-4de0-9550-c885b528b9e4","Type":"ContainerStarted","Data":"6b37d9d45fcd2e3f298863e909656ecfabcdab5de0e0ca9edabfae145dd305dd"} Mar 10 09:17:06 crc kubenswrapper[4883]: I0310 09:17:06.332993 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f5c87b79f-276j9" podStartSLOduration=2.332970788 podStartE2EDuration="2.332970788s" podCreationTimestamp="2026-03-10 09:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:17:06.328788682 +0000 UTC m=+812.583686571" watchObservedRunningTime="2026-03-10 09:17:06.332970788 +0000 UTC m=+812.587868668" Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.329037 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" 
event={"ID":"291985dd-d623-46ba-9e1b-056dc17d26ed","Type":"ContainerStarted","Data":"60bba0ac4713e076ae773662f22d057c62e6f792cea8147a81fcc4ad2de183e8"} Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.330501 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5lcxd" event={"ID":"d9c7e9ee-a0a0-4afe-bd00-872553ca9b32","Type":"ContainerStarted","Data":"8e743bf98a4b7c19707c708bbe3ae55e6cd99546ee27ae36c8a9a11adef9c198"} Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.330640 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.332034 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" event={"ID":"10ab1e00-47a1-4f9a-a55a-131935759d8d","Type":"ContainerStarted","Data":"435e91624b781dddcbca037217e0bf0479131e3e631316b8bc63873d53ab0138"} Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.332099 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.334092 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" event={"ID":"805fc4e3-bab7-415e-a190-0ceeda5bd8b7","Type":"ContainerStarted","Data":"5ba26dfc0001a6d7e6030e41a1476883a8b7f7d62267f47a3fff99518b95f655"} Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.346233 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-5lcxd" podStartSLOduration=1.468263403 podStartE2EDuration="4.346216507s" podCreationTimestamp="2026-03-10 09:17:04 +0000 UTC" firstStartedPulling="2026-03-10 09:17:04.527352689 +0000 UTC m=+810.782250579" lastFinishedPulling="2026-03-10 09:17:07.405305795 +0000 UTC m=+813.660203683" observedRunningTime="2026-03-10 
09:17:08.346102221 +0000 UTC m=+814.601000111" watchObservedRunningTime="2026-03-10 09:17:08.346216507 +0000 UTC m=+814.601114396" Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.362202 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mr8tf" podStartSLOduration=1.69055761 podStartE2EDuration="4.362193815s" podCreationTimestamp="2026-03-10 09:17:04 +0000 UTC" firstStartedPulling="2026-03-10 09:17:04.733987558 +0000 UTC m=+810.988885448" lastFinishedPulling="2026-03-10 09:17:07.405623764 +0000 UTC m=+813.660521653" observedRunningTime="2026-03-10 09:17:08.361331738 +0000 UTC m=+814.616229627" watchObservedRunningTime="2026-03-10 09:17:08.362193815 +0000 UTC m=+814.617091703" Mar 10 09:17:08 crc kubenswrapper[4883]: I0310 09:17:08.383876 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" podStartSLOduration=2.2504544969999998 podStartE2EDuration="4.383846372s" podCreationTimestamp="2026-03-10 09:17:04 +0000 UTC" firstStartedPulling="2026-03-10 09:17:05.273796308 +0000 UTC m=+811.528694197" lastFinishedPulling="2026-03-10 09:17:07.407188184 +0000 UTC m=+813.662086072" observedRunningTime="2026-03-10 09:17:08.377926018 +0000 UTC m=+814.632823907" watchObservedRunningTime="2026-03-10 09:17:08.383846372 +0000 UTC m=+814.638744261" Mar 10 09:17:10 crc kubenswrapper[4883]: I0310 09:17:10.359904 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" event={"ID":"291985dd-d623-46ba-9e1b-056dc17d26ed","Type":"ContainerStarted","Data":"7d17991f0f95e570e28efdbbe1e355d52cd0ec8be617f0a9c2372dfae74c99ec"} Mar 10 09:17:10 crc kubenswrapper[4883]: I0310 09:17:10.376640 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-x5lcq" podStartSLOduration=1.267102166 podStartE2EDuration="6.376616702s" 
podCreationTimestamp="2026-03-10 09:17:04 +0000 UTC" firstStartedPulling="2026-03-10 09:17:04.871243969 +0000 UTC m=+811.126141868" lastFinishedPulling="2026-03-10 09:17:09.980758515 +0000 UTC m=+816.235656404" observedRunningTime="2026-03-10 09:17:10.375269823 +0000 UTC m=+816.630167701" watchObservedRunningTime="2026-03-10 09:17:10.376616702 +0000 UTC m=+816.631514592" Mar 10 09:17:14 crc kubenswrapper[4883]: I0310 09:17:14.531443 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-5lcxd" Mar 10 09:17:14 crc kubenswrapper[4883]: I0310 09:17:14.810698 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:14 crc kubenswrapper[4883]: I0310 09:17:14.810767 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:14 crc kubenswrapper[4883]: I0310 09:17:14.815655 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:15 crc kubenswrapper[4883]: I0310 09:17:15.393563 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f5c87b79f-276j9" Mar 10 09:17:15 crc kubenswrapper[4883]: I0310 09:17:15.434252 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nbvf4"] Mar 10 09:17:17 crc kubenswrapper[4883]: I0310 09:17:17.448891 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:17:17 crc kubenswrapper[4883]: I0310 09:17:17.449374 4883 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:17:17 crc kubenswrapper[4883]: I0310 09:17:17.449434 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:17:17 crc kubenswrapper[4883]: I0310 09:17:17.450195 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:17:17 crc kubenswrapper[4883]: I0310 09:17:17.450261 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c" gracePeriod=600 Mar 10 09:17:18 crc kubenswrapper[4883]: I0310 09:17:18.410992 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c" exitCode=0 Mar 10 09:17:18 crc kubenswrapper[4883]: I0310 09:17:18.411090 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c"} Mar 10 09:17:18 crc kubenswrapper[4883]: I0310 09:17:18.411704 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e"} Mar 10 09:17:18 crc kubenswrapper[4883]: I0310 09:17:18.411758 4883 scope.go:117] "RemoveContainer" containerID="fa3cbb52c5196a50f8eb140640d5ec9255382224cac25f2663c634e049543e13" Mar 10 09:17:25 crc kubenswrapper[4883]: I0310 09:17:25.091903 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-ccbds" Mar 10 09:17:26 crc kubenswrapper[4883]: I0310 09:17:26.112533 4883 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 09:17:36 crc kubenswrapper[4883]: I0310 09:17:36.876435 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5"] Mar 10 09:17:36 crc kubenswrapper[4883]: I0310 09:17:36.878901 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:36 crc kubenswrapper[4883]: I0310 09:17:36.882618 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 09:17:36 crc kubenswrapper[4883]: I0310 09:17:36.887555 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5"] Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.028384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz22j\" (UniqueName: \"kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.028562 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.028645 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: 
I0310 09:17:37.130168 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.130241 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz22j\" (UniqueName: \"kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.130280 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.130783 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.130820 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5"
Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.149291 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz22j\" (UniqueName: \"kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5"
Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.196628 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5"
Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.351218 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5"]
Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.535618 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerStarted","Data":"0d126ab653fc0a17d08ec204f3cad246a3ba074a74ae06be7ba81b3541fc4041"}
Mar 10 09:17:37 crc kubenswrapper[4883]: I0310 09:17:37.535673 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerStarted","Data":"09e5e9966c9e77e8871e74d42d8fc0d89cf7a2a605c521a49f9476f6877182ee"}
Mar 10 09:17:38 crc kubenswrapper[4883]: I0310 09:17:38.542306 4883 generic.go:334] "Generic (PLEG): container finished" podID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerID="0d126ab653fc0a17d08ec204f3cad246a3ba074a74ae06be7ba81b3541fc4041" exitCode=0
Mar 10 09:17:38 crc kubenswrapper[4883]: I0310 09:17:38.542365 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerDied","Data":"0d126ab653fc0a17d08ec204f3cad246a3ba074a74ae06be7ba81b3541fc4041"}
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.433742 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"]
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.435841 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.446850 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"]
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.463207 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nbvf4" podUID="dd173309-9e96-468f-a21c-f25c86186744" containerName="console" containerID="cri-o://2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e" gracePeriod=15
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.554775 4883 generic.go:334] "Generic (PLEG): container finished" podID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerID="930e380027834eadef223b69b206531acf59794b14dc67c94e162258576d9598" exitCode=0
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.554824 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerDied","Data":"930e380027834eadef223b69b206531acf59794b14dc67c94e162258576d9598"}
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.570301 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.570385 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.570411 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.671983 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.672034 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.672082 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.672532 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.672599 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.688495 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn\") pod \"redhat-operators-2mq4z\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.748646 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.757680 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nbvf4_dd173309-9e96-468f-a21c-f25c86186744/console/0.log"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.757751 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876148 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fw5p\" (UniqueName: \"kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") "
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876258 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") "
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876311 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") "
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876348 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") "
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876387 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") "
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876406 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") "
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.876443 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert\") pod \"dd173309-9e96-468f-a21c-f25c86186744\" (UID: \"dd173309-9e96-468f-a21c-f25c86186744\") "
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.877276 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config" (OuterVolumeSpecName: "console-config") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.877368 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.877632 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.878724 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca" (OuterVolumeSpecName: "service-ca") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.881209 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.882341 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p" (OuterVolumeSpecName: "kube-api-access-8fw5p") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "kube-api-access-8fw5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.883216 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dd173309-9e96-468f-a21c-f25c86186744" (UID: "dd173309-9e96-468f-a21c-f25c86186744"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.916770 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"]
Mar 10 09:17:40 crc kubenswrapper[4883]: W0310 09:17:40.923029 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c5e84e5_8671_4388_a92e_6ce1ecab3f48.slice/crio-5d42fd2f91689c5c615a6a94552d6057a6963d3eac75d13eac66d07c14702c45 WatchSource:0}: Error finding container 5d42fd2f91689c5c615a6a94552d6057a6963d3eac75d13eac66d07c14702c45: Status 404 returned error can't find the container with id 5d42fd2f91689c5c615a6a94552d6057a6963d3eac75d13eac66d07c14702c45
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.977956 4883 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-console-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.977986 4883 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.977999 4883 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.978011 4883 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.978020 4883 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd173309-9e96-468f-a21c-f25c86186744-service-ca\") on node \"crc\" DevicePath \"\""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.978029 4883 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd173309-9e96-468f-a21c-f25c86186744-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 09:17:40 crc kubenswrapper[4883]: I0310 09:17:40.978037 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fw5p\" (UniqueName: \"kubernetes.io/projected/dd173309-9e96-468f-a21c-f25c86186744-kube-api-access-8fw5p\") on node \"crc\" DevicePath \"\""
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562445 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nbvf4_dd173309-9e96-468f-a21c-f25c86186744/console/0.log"
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562793 4883 generic.go:334] "Generic (PLEG): container finished" podID="dd173309-9e96-468f-a21c-f25c86186744" containerID="2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e" exitCode=2
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562871 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nbvf4"
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562896 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbvf4" event={"ID":"dd173309-9e96-468f-a21c-f25c86186744","Type":"ContainerDied","Data":"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e"}
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562940 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nbvf4" event={"ID":"dd173309-9e96-468f-a21c-f25c86186744","Type":"ContainerDied","Data":"4ec8d1b5827960f6bd137ff58dec19e8986121f159740a144f7e580ab0677ffa"}
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.562976 4883 scope.go:117] "RemoveContainer" containerID="2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e"
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.564942 4883 generic.go:334] "Generic (PLEG): container finished" podID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerID="9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99" exitCode=0
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.565021 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerDied","Data":"9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99"}
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.565096 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerStarted","Data":"5d42fd2f91689c5c615a6a94552d6057a6963d3eac75d13eac66d07c14702c45"}
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.570674 4883 generic.go:334] "Generic (PLEG): container finished" podID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerID="b3534952ad463b08ff6cd601ba9ff1b1621cb24e5aa9c6acbd423bdc05afdde7" exitCode=0
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.570719 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerDied","Data":"b3534952ad463b08ff6cd601ba9ff1b1621cb24e5aa9c6acbd423bdc05afdde7"}
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.578250 4883 scope.go:117] "RemoveContainer" containerID="2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e"
Mar 10 09:17:41 crc kubenswrapper[4883]: E0310 09:17:41.579084 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e\": container with ID starting with 2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e not found: ID does not exist" containerID="2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e"
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.579144 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e"} err="failed to get container status \"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e\": rpc error: code = NotFound desc = could not find container \"2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e\": container with ID starting with 2df371bd0d721d2828d2bc8830494c77b0e9f64eac37040900f75bec70ba2c3e not found: ID does not exist"
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.631612 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nbvf4"]
Mar 10 09:17:41 crc kubenswrapper[4883]: I0310 09:17:41.649098 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nbvf4"]
Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.086731 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd173309-9e96-468f-a21c-f25c86186744" path="/var/lib/kubelet/pods/dd173309-9e96-468f-a21c-f25c86186744/volumes"
Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.582042 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerStarted","Data":"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3"}
Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.778954 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5"
Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.905278 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle\") pod \"8aebf63f-b8d3-496c-a660-c484d574fb63\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") "
Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.905384 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util\") pod \"8aebf63f-b8d3-496c-a660-c484d574fb63\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") "
Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.905523 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz22j\" (UniqueName: \"kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j\") pod \"8aebf63f-b8d3-496c-a660-c484d574fb63\" (UID: \"8aebf63f-b8d3-496c-a660-c484d574fb63\") "
Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.906211 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle" (OuterVolumeSpecName: "bundle") pod "8aebf63f-b8d3-496c-a660-c484d574fb63" (UID: "8aebf63f-b8d3-496c-a660-c484d574fb63"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:17:42 crc kubenswrapper[4883]: I0310 09:17:42.911216 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j" (OuterVolumeSpecName: "kube-api-access-mz22j") pod "8aebf63f-b8d3-496c-a660-c484d574fb63" (UID: "8aebf63f-b8d3-496c-a660-c484d574fb63"). InnerVolumeSpecName "kube-api-access-mz22j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.007361 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz22j\" (UniqueName: \"kubernetes.io/projected/8aebf63f-b8d3-496c-a660-c484d574fb63-kube-api-access-mz22j\") on node \"crc\" DevicePath \"\""
Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.007395 4883 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.094890 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util" (OuterVolumeSpecName: "util") pod "8aebf63f-b8d3-496c-a660-c484d574fb63" (UID: "8aebf63f-b8d3-496c-a660-c484d574fb63"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.108064 4883 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8aebf63f-b8d3-496c-a660-c484d574fb63-util\") on node \"crc\" DevicePath \"\""
Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.593051 4883 generic.go:334] "Generic (PLEG): container finished" podID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerID="8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3" exitCode=0
Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.593147 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerDied","Data":"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3"}
Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.598718 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5" event={"ID":"8aebf63f-b8d3-496c-a660-c484d574fb63","Type":"ContainerDied","Data":"09e5e9966c9e77e8871e74d42d8fc0d89cf7a2a605c521a49f9476f6877182ee"}
Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.598785 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09e5e9966c9e77e8871e74d42d8fc0d89cf7a2a605c521a49f9476f6877182ee"
Mar 10 09:17:43 crc kubenswrapper[4883]: I0310 09:17:43.598815 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5"
Mar 10 09:17:44 crc kubenswrapper[4883]: I0310 09:17:44.605663 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerStarted","Data":"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a"}
Mar 10 09:17:44 crc kubenswrapper[4883]: I0310 09:17:44.629318 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2mq4z" podStartSLOduration=2.146433301 podStartE2EDuration="4.629302049s" podCreationTimestamp="2026-03-10 09:17:40 +0000 UTC" firstStartedPulling="2026-03-10 09:17:41.566214666 +0000 UTC m=+847.821112555" lastFinishedPulling="2026-03-10 09:17:44.049083414 +0000 UTC m=+850.303981303" observedRunningTime="2026-03-10 09:17:44.626264931 +0000 UTC m=+850.881162821" watchObservedRunningTime="2026-03-10 09:17:44.629302049 +0000 UTC m=+850.884199937"
Mar 10 09:17:50 crc kubenswrapper[4883]: I0310 09:17:50.749432 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:50 crc kubenswrapper[4883]: I0310 09:17:50.749797 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:50 crc kubenswrapper[4883]: I0310 09:17:50.784440 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:51 crc kubenswrapper[4883]: I0310 09:17:51.697438 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2mq4z"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418234 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"]
Mar 10 09:17:52 crc kubenswrapper[4883]: E0310 09:17:52.418443 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd173309-9e96-468f-a21c-f25c86186744" containerName="console"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418455 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd173309-9e96-468f-a21c-f25c86186744" containerName="console"
Mar 10 09:17:52 crc kubenswrapper[4883]: E0310 09:17:52.418511 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="pull"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418518 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="pull"
Mar 10 09:17:52 crc kubenswrapper[4883]: E0310 09:17:52.418530 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="extract"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418546 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="extract"
Mar 10 09:17:52 crc kubenswrapper[4883]: E0310 09:17:52.418555 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="util"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418560 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="util"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418676 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd173309-9e96-468f-a21c-f25c86186744" containerName="console"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.418691 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aebf63f-b8d3-496c-a660-c484d574fb63" containerName="extract"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.419046 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.420382 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.421201 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.421364 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.421516 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.421943 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5p6f5"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.433171 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"]
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.436105 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84c69\" (UniqueName: \"kubernetes.io/projected/5804aa0d-ee19-4fb3-bd39-27c7103571d8-kube-api-access-84c69\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.436190 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-webhook-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.436225 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-apiservice-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.537523 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-webhook-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.537587 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-apiservice-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.537656 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84c69\" (UniqueName: \"kubernetes.io/projected/5804aa0d-ee19-4fb3-bd39-27c7103571d8-kube-api-access-84c69\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.544385 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-webhook-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.544409 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5804aa0d-ee19-4fb3-bd39-27c7103571d8-apiservice-cert\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.554121 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84c69\" (UniqueName: \"kubernetes.io/projected/5804aa0d-ee19-4fb3-bd39-27c7103571d8-kube-api-access-84c69\") pod \"metallb-operator-controller-manager-c79cc77cd-s6vgn\" (UID: \"5804aa0d-ee19-4fb3-bd39-27c7103571d8\") " pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.734387 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.870615 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"]
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.871593 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.875502 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.875983 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5zzbs"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.876198 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.882951 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"]
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.952348 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-webhook-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.952391 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-apiservice-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"
Mar 10 09:17:52 crc kubenswrapper[4883]: I0310 09:17:52.952609 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n875\" (UniqueName: \"kubernetes.io/projected/cb05036e-52f2-48ab-ba84-f89c4565a0af-kube-api-access-9n875\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"
Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.031923 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn"]
Mar 10 09:17:53 crc kubenswrapper[4883]: W0310 09:17:53.042551 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5804aa0d_ee19_4fb3_bd39_27c7103571d8.slice/crio-69e140fc9679dd9f8ec82b71ac27155130806adf53b04c436f242093932e69a7 WatchSource:0}: Error finding container 69e140fc9679dd9f8ec82b71ac27155130806adf53b04c436f242093932e69a7: Status 404 returned error can't find the container with id 69e140fc9679dd9f8ec82b71ac27155130806adf53b04c436f242093932e69a7
Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.053608 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n875\" (UniqueName: \"kubernetes.io/projected/cb05036e-52f2-48ab-ba84-f89c4565a0af-kube-api-access-9n875\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"
Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.053708 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-webhook-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"
Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.053737 4883 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-apiservice-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.058094 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-apiservice-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.060383 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cb05036e-52f2-48ab-ba84-f89c4565a0af-webhook-cert\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.069367 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n875\" (UniqueName: \"kubernetes.io/projected/cb05036e-52f2-48ab-ba84-f89c4565a0af-kube-api-access-9n875\") pod \"metallb-operator-webhook-server-57848ff665-prp4d\" (UID: \"cb05036e-52f2-48ab-ba84-f89c4565a0af\") " pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.190377 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.576378 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57848ff665-prp4d"] Mar 10 09:17:53 crc kubenswrapper[4883]: W0310 09:17:53.583917 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb05036e_52f2_48ab_ba84_f89c4565a0af.slice/crio-928c868bf90255745b261f14c3e93fa385f36d5abf5f54e50f9dfcdb77283a4b WatchSource:0}: Error finding container 928c868bf90255745b261f14c3e93fa385f36d5abf5f54e50f9dfcdb77283a4b: Status 404 returned error can't find the container with id 928c868bf90255745b261f14c3e93fa385f36d5abf5f54e50f9dfcdb77283a4b Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.677341 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" event={"ID":"cb05036e-52f2-48ab-ba84-f89c4565a0af","Type":"ContainerStarted","Data":"928c868bf90255745b261f14c3e93fa385f36d5abf5f54e50f9dfcdb77283a4b"} Mar 10 09:17:53 crc kubenswrapper[4883]: I0310 09:17:53.678681 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" event={"ID":"5804aa0d-ee19-4fb3-bd39-27c7103571d8","Type":"ContainerStarted","Data":"69e140fc9679dd9f8ec82b71ac27155130806adf53b04c436f242093932e69a7"} Mar 10 09:17:54 crc kubenswrapper[4883]: I0310 09:17:54.830339 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"] Mar 10 09:17:54 crc kubenswrapper[4883]: I0310 09:17:54.831101 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2mq4z" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="registry-server" 
containerID="cri-o://97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a" gracePeriod=2 Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.264291 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.283096 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities\") pod \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.283144 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn\") pod \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.283164 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content\") pod \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\" (UID: \"8c5e84e5-8671-4388-a92e-6ce1ecab3f48\") " Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.288345 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities" (OuterVolumeSpecName: "utilities") pod "8c5e84e5-8671-4388-a92e-6ce1ecab3f48" (UID: "8c5e84e5-8671-4388-a92e-6ce1ecab3f48"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.290340 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn" (OuterVolumeSpecName: "kube-api-access-t5bmn") pod "8c5e84e5-8671-4388-a92e-6ce1ecab3f48" (UID: "8c5e84e5-8671-4388-a92e-6ce1ecab3f48"). InnerVolumeSpecName "kube-api-access-t5bmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.383988 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.384228 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5bmn\" (UniqueName: \"kubernetes.io/projected/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-kube-api-access-t5bmn\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.397000 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c5e84e5-8671-4388-a92e-6ce1ecab3f48" (UID: "8c5e84e5-8671-4388-a92e-6ce1ecab3f48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.484911 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c5e84e5-8671-4388-a92e-6ce1ecab3f48-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.698584 4883 generic.go:334] "Generic (PLEG): container finished" podID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerID="97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a" exitCode=0 Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.698640 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerDied","Data":"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a"} Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.698670 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2mq4z" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.698715 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2mq4z" event={"ID":"8c5e84e5-8671-4388-a92e-6ce1ecab3f48","Type":"ContainerDied","Data":"5d42fd2f91689c5c615a6a94552d6057a6963d3eac75d13eac66d07c14702c45"} Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.698738 4883 scope.go:117] "RemoveContainer" containerID="97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.727880 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"] Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.731513 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2mq4z"] Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.963832 4883 scope.go:117] "RemoveContainer" containerID="8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3" Mar 10 09:17:55 crc kubenswrapper[4883]: I0310 09:17:55.995393 4883 scope.go:117] "RemoveContainer" containerID="9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.011308 4883 scope.go:117] "RemoveContainer" containerID="97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a" Mar 10 09:17:56 crc kubenswrapper[4883]: E0310 09:17:56.011712 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a\": container with ID starting with 97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a not found: ID does not exist" containerID="97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.011748 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a"} err="failed to get container status \"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a\": rpc error: code = NotFound desc = could not find container \"97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a\": container with ID starting with 97b049bf9ba22b24301b08b12ce9556da6dd0f01e9633e96f1626984f6e2923a not found: ID does not exist" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.011773 4883 scope.go:117] "RemoveContainer" containerID="8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3" Mar 10 09:17:56 crc kubenswrapper[4883]: E0310 09:17:56.012037 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3\": container with ID starting with 8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3 not found: ID does not exist" containerID="8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.012063 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3"} err="failed to get container status \"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3\": rpc error: code = NotFound desc = could not find container \"8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3\": container with ID starting with 8ba5c386f62cc75fe0522613c2994e7be5a4a594c0a357a9357f9f8809c35cd3 not found: ID does not exist" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.012078 4883 scope.go:117] "RemoveContainer" containerID="9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99" Mar 10 09:17:56 crc kubenswrapper[4883]: E0310 
09:17:56.012323 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99\": container with ID starting with 9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99 not found: ID does not exist" containerID="9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.012344 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99"} err="failed to get container status \"9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99\": rpc error: code = NotFound desc = could not find container \"9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99\": container with ID starting with 9d8c0d482d1bc358131ba70cff4e38f95a10040d010c0d0846308d31973fed99 not found: ID does not exist" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.093573 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" path="/var/lib/kubelet/pods/8c5e84e5-8671-4388-a92e-6ce1ecab3f48/volumes" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.709430 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" event={"ID":"5804aa0d-ee19-4fb3-bd39-27c7103571d8","Type":"ContainerStarted","Data":"c458dd6596c0b4b74519f686fff3a176d4ad08160f8d0443a5d01e94318379ea"} Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.709762 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:17:56 crc kubenswrapper[4883]: I0310 09:17:56.730352 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" podStartSLOduration=1.777283781 podStartE2EDuration="4.730326453s" podCreationTimestamp="2026-03-10 09:17:52 +0000 UTC" firstStartedPulling="2026-03-10 09:17:53.045565597 +0000 UTC m=+859.300463487" lastFinishedPulling="2026-03-10 09:17:55.99860827 +0000 UTC m=+862.253506159" observedRunningTime="2026-03-10 09:17:56.726177198 +0000 UTC m=+862.981075087" watchObservedRunningTime="2026-03-10 09:17:56.730326453 +0000 UTC m=+862.985224341" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.032607 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xrpkc"] Mar 10 09:17:57 crc kubenswrapper[4883]: E0310 09:17:57.032872 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="registry-server" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.032886 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="registry-server" Mar 10 09:17:57 crc kubenswrapper[4883]: E0310 09:17:57.032904 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="extract-content" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.032910 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="extract-content" Mar 10 09:17:57 crc kubenswrapper[4883]: E0310 09:17:57.032917 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="extract-utilities" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.032923 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="extract-utilities" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.033039 4883 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="8c5e84e5-8671-4388-a92e-6ce1ecab3f48" containerName="registry-server" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.034964 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.044563 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xrpkc"] Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.114367 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.114455 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb84d\" (UniqueName: \"kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.114517 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.215588 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content\") pod \"community-operators-xrpkc\" (UID: 
\"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.215673 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.215724 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb84d\" (UniqueName: \"kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.216093 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.216163 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities\") pod \"community-operators-xrpkc\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.275444 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb84d\" (UniqueName: \"kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d\") pod \"community-operators-xrpkc\" (UID: 
\"62f58f16-8f76-44eb-8788-eb8664952511\") " pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:57 crc kubenswrapper[4883]: I0310 09:17:57.357117 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.271572 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xrpkc"] Mar 10 09:17:58 crc kubenswrapper[4883]: W0310 09:17:58.278377 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62f58f16_8f76_44eb_8788_eb8664952511.slice/crio-9a78ebb693346faa79509860e04e83398d7b4673c07eb43e461520f2042f1db5 WatchSource:0}: Error finding container 9a78ebb693346faa79509860e04e83398d7b4673c07eb43e461520f2042f1db5: Status 404 returned error can't find the container with id 9a78ebb693346faa79509860e04e83398d7b4673c07eb43e461520f2042f1db5 Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.725219 4883 generic.go:334] "Generic (PLEG): container finished" podID="62f58f16-8f76-44eb-8788-eb8664952511" containerID="bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3" exitCode=0 Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.725327 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerDied","Data":"bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3"} Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.725400 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerStarted","Data":"9a78ebb693346faa79509860e04e83398d7b4673c07eb43e461520f2042f1db5"} Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.728964 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" event={"ID":"cb05036e-52f2-48ab-ba84-f89c4565a0af","Type":"ContainerStarted","Data":"dc94503e5bf9ef7204b9cec124249434bddc81760853cf33b25fd7933cec8722"} Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.729268 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:17:58 crc kubenswrapper[4883]: I0310 09:17:58.764130 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" podStartSLOduration=2.43585861 podStartE2EDuration="6.764107853s" podCreationTimestamp="2026-03-10 09:17:52 +0000 UTC" firstStartedPulling="2026-03-10 09:17:53.587088099 +0000 UTC m=+859.841985988" lastFinishedPulling="2026-03-10 09:17:57.915337343 +0000 UTC m=+864.170235231" observedRunningTime="2026-03-10 09:17:58.759255151 +0000 UTC m=+865.014153041" watchObservedRunningTime="2026-03-10 09:17:58.764107853 +0000 UTC m=+865.019005742" Mar 10 09:17:59 crc kubenswrapper[4883]: I0310 09:17:59.736501 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerStarted","Data":"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f"} Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.129316 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552238-qbbs2"] Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.130060 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-qbbs2" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.131560 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.131938 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.135507 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.137158 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-qbbs2"] Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.155275 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s6d9\" (UniqueName: \"kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9\") pod \"auto-csr-approver-29552238-qbbs2\" (UID: \"38171111-f624-438d-ba5a-36f6b9cb29bf\") " pod="openshift-infra/auto-csr-approver-29552238-qbbs2" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.255916 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s6d9\" (UniqueName: \"kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9\") pod \"auto-csr-approver-29552238-qbbs2\" (UID: \"38171111-f624-438d-ba5a-36f6b9cb29bf\") " pod="openshift-infra/auto-csr-approver-29552238-qbbs2" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.272240 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s6d9\" (UniqueName: \"kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9\") pod \"auto-csr-approver-29552238-qbbs2\" (UID: \"38171111-f624-438d-ba5a-36f6b9cb29bf\") " 
pod="openshift-infra/auto-csr-approver-29552238-qbbs2" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.441985 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-qbbs2" Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.632257 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-qbbs2"] Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.743015 4883 generic.go:334] "Generic (PLEG): container finished" podID="62f58f16-8f76-44eb-8788-eb8664952511" containerID="024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f" exitCode=0 Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.743108 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerDied","Data":"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f"} Mar 10 09:18:00 crc kubenswrapper[4883]: I0310 09:18:00.744227 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552238-qbbs2" event={"ID":"38171111-f624-438d-ba5a-36f6b9cb29bf","Type":"ContainerStarted","Data":"7b7bc77b0a8c57d3cf6064703d8a10ff9c4c2187f99d6a1887470fc740fe96c1"} Mar 10 09:18:01 crc kubenswrapper[4883]: I0310 09:18:01.752212 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerStarted","Data":"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0"} Mar 10 09:18:01 crc kubenswrapper[4883]: I0310 09:18:01.777520 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xrpkc" podStartSLOduration=2.264522033 podStartE2EDuration="4.777506423s" podCreationTimestamp="2026-03-10 09:17:57 +0000 UTC" firstStartedPulling="2026-03-10 
09:17:58.727347801 +0000 UTC m=+864.982245690" lastFinishedPulling="2026-03-10 09:18:01.240332191 +0000 UTC m=+867.495230080" observedRunningTime="2026-03-10 09:18:01.77371319 +0000 UTC m=+868.028611069" watchObservedRunningTime="2026-03-10 09:18:01.777506423 +0000 UTC m=+868.032404313" Mar 10 09:18:02 crc kubenswrapper[4883]: I0310 09:18:02.761760 4883 generic.go:334] "Generic (PLEG): container finished" podID="38171111-f624-438d-ba5a-36f6b9cb29bf" containerID="e20d3f6d5f3aae231c536075cd1098cf482fcd5c0cc1095b975e4d04ba285b0b" exitCode=0 Mar 10 09:18:02 crc kubenswrapper[4883]: I0310 09:18:02.761811 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552238-qbbs2" event={"ID":"38171111-f624-438d-ba5a-36f6b9cb29bf","Type":"ContainerDied","Data":"e20d3f6d5f3aae231c536075cd1098cf482fcd5c0cc1095b975e4d04ba285b0b"} Mar 10 09:18:03 crc kubenswrapper[4883]: I0310 09:18:03.983100 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-qbbs2" Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.002843 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s6d9\" (UniqueName: \"kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9\") pod \"38171111-f624-438d-ba5a-36f6b9cb29bf\" (UID: \"38171111-f624-438d-ba5a-36f6b9cb29bf\") " Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.008638 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9" (OuterVolumeSpecName: "kube-api-access-7s6d9") pod "38171111-f624-438d-ba5a-36f6b9cb29bf" (UID: "38171111-f624-438d-ba5a-36f6b9cb29bf"). InnerVolumeSpecName "kube-api-access-7s6d9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.104895 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s6d9\" (UniqueName: \"kubernetes.io/projected/38171111-f624-438d-ba5a-36f6b9cb29bf-kube-api-access-7s6d9\") on node \"crc\" DevicePath \"\"" Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.793047 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552238-qbbs2" event={"ID":"38171111-f624-438d-ba5a-36f6b9cb29bf","Type":"ContainerDied","Data":"7b7bc77b0a8c57d3cf6064703d8a10ff9c4c2187f99d6a1887470fc740fe96c1"} Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.793103 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b7bc77b0a8c57d3cf6064703d8a10ff9c4c2187f99d6a1887470fc740fe96c1" Mar 10 09:18:04 crc kubenswrapper[4883]: I0310 09:18:04.793121 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-qbbs2" Mar 10 09:18:05 crc kubenswrapper[4883]: I0310 09:18:05.032282 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-d429x"] Mar 10 09:18:05 crc kubenswrapper[4883]: I0310 09:18:05.036872 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-d429x"] Mar 10 09:18:06 crc kubenswrapper[4883]: I0310 09:18:06.085272 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf9a36e-c0e2-4943-a87c-9f6735b2714e" path="/var/lib/kubelet/pods/bbf9a36e-c0e2-4943-a87c-9f6735b2714e/volumes" Mar 10 09:18:07 crc kubenswrapper[4883]: I0310 09:18:07.358364 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:18:07 crc kubenswrapper[4883]: I0310 09:18:07.358435 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:18:07 crc kubenswrapper[4883]: I0310 09:18:07.398300 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:18:07 crc kubenswrapper[4883]: I0310 09:18:07.849796 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:18:09 crc kubenswrapper[4883]: I0310 09:18:09.628049 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xrpkc"] Mar 10 09:18:09 crc kubenswrapper[4883]: I0310 09:18:09.824672 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xrpkc" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="registry-server" containerID="cri-o://ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0" gracePeriod=2 Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.165559 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.285098 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb84d\" (UniqueName: \"kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d\") pod \"62f58f16-8f76-44eb-8788-eb8664952511\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.285279 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities\") pod \"62f58f16-8f76-44eb-8788-eb8664952511\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.285429 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content\") pod \"62f58f16-8f76-44eb-8788-eb8664952511\" (UID: \"62f58f16-8f76-44eb-8788-eb8664952511\") " Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.286186 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities" (OuterVolumeSpecName: "utilities") pod "62f58f16-8f76-44eb-8788-eb8664952511" (UID: "62f58f16-8f76-44eb-8788-eb8664952511"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.291799 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d" (OuterVolumeSpecName: "kube-api-access-xb84d") pod "62f58f16-8f76-44eb-8788-eb8664952511" (UID: "62f58f16-8f76-44eb-8788-eb8664952511"). InnerVolumeSpecName "kube-api-access-xb84d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.328266 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62f58f16-8f76-44eb-8788-eb8664952511" (UID: "62f58f16-8f76-44eb-8788-eb8664952511"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.386761 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.386790 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62f58f16-8f76-44eb-8788-eb8664952511-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.386803 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb84d\" (UniqueName: \"kubernetes.io/projected/62f58f16-8f76-44eb-8788-eb8664952511-kube-api-access-xb84d\") on node \"crc\" DevicePath \"\"" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.833990 4883 generic.go:334] "Generic (PLEG): container finished" podID="62f58f16-8f76-44eb-8788-eb8664952511" containerID="ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0" exitCode=0 Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.834050 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerDied","Data":"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0"} Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.834067 4883 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrpkc" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.834115 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrpkc" event={"ID":"62f58f16-8f76-44eb-8788-eb8664952511","Type":"ContainerDied","Data":"9a78ebb693346faa79509860e04e83398d7b4673c07eb43e461520f2042f1db5"} Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.834159 4883 scope.go:117] "RemoveContainer" containerID="ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.857960 4883 scope.go:117] "RemoveContainer" containerID="024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.859660 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xrpkc"] Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.863358 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xrpkc"] Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.877494 4883 scope.go:117] "RemoveContainer" containerID="bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.890208 4883 scope.go:117] "RemoveContainer" containerID="ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0" Mar 10 09:18:10 crc kubenswrapper[4883]: E0310 09:18:10.890487 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0\": container with ID starting with ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0 not found: ID does not exist" containerID="ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.890514 
4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0"} err="failed to get container status \"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0\": rpc error: code = NotFound desc = could not find container \"ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0\": container with ID starting with ad0262416b9def2f7339fb2982dcd220ba04dd7f72ecf2c176144e098fd37bf0 not found: ID does not exist" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.890536 4883 scope.go:117] "RemoveContainer" containerID="024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f" Mar 10 09:18:10 crc kubenswrapper[4883]: E0310 09:18:10.890818 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f\": container with ID starting with 024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f not found: ID does not exist" containerID="024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.890860 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f"} err="failed to get container status \"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f\": rpc error: code = NotFound desc = could not find container \"024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f\": container with ID starting with 024a5a54a4795efad07715135bb1a0a93c370f9323a30a30e707686ce538364f not found: ID does not exist" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.890890 4883 scope.go:117] "RemoveContainer" containerID="bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3" Mar 10 09:18:10 crc kubenswrapper[4883]: E0310 
09:18:10.891126 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3\": container with ID starting with bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3 not found: ID does not exist" containerID="bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3" Mar 10 09:18:10 crc kubenswrapper[4883]: I0310 09:18:10.891163 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3"} err="failed to get container status \"bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3\": rpc error: code = NotFound desc = could not find container \"bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3\": container with ID starting with bd8a67d3183562ca3b04a62c8912f4241c9940139ed8494135751f05c18e3fb3 not found: ID does not exist" Mar 10 09:18:12 crc kubenswrapper[4883]: I0310 09:18:12.087200 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62f58f16-8f76-44eb-8788-eb8664952511" path="/var/lib/kubelet/pods/62f58f16-8f76-44eb-8788-eb8664952511/volumes" Mar 10 09:18:13 crc kubenswrapper[4883]: I0310 09:18:13.195179 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-57848ff665-prp4d" Mar 10 09:18:32 crc kubenswrapper[4883]: I0310 09:18:32.738870 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-c79cc77cd-s6vgn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.280513 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-ck8gb"] Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.280905 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="registry-server" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.280927 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="registry-server" Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.280955 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="extract-utilities" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.280962 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="extract-utilities" Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.280971 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="extract-content" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.280986 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="extract-content" Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.280996 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38171111-f624-438d-ba5a-36f6b9cb29bf" containerName="oc" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.281001 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="38171111-f624-438d-ba5a-36f6b9cb29bf" containerName="oc" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.281138 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f58f16-8f76-44eb-8788-eb8664952511" containerName="registry-server" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.281150 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="38171111-f624-438d-ba5a-36f6b9cb29bf" containerName="oc" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.283067 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.284703 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.285193 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.285506 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gf92g" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.288618 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"] Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.292156 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.301513 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.309959 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"] Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.361852 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gtqfn"] Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.363128 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.371248 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-rtrbh"] Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.373267 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zwvch" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.373444 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.373890 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.374587 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.375388 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377410 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-conf\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377454 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377501 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics-certs\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377525 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-sockets\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377548 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnz6k\" (UniqueName: \"kubernetes.io/projected/8e843a56-715a-44fc-9974-8570d49bd9a0-kube-api-access-mnz6k\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377568 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-reloader\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377619 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfn9k\" (UniqueName: \"kubernetes.io/projected/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-kube-api-access-xfn9k\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377673 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xlqm8\" (UniqueName: \"kubernetes.io/projected/d2caf019-bd64-4a5c-bf88-c260178bdc82-kube-api-access-xlqm8\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377688 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metallb-excludel2\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377702 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-startup\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377729 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377761 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e843a56-715a-44fc-9974-8570d49bd9a0-cert\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.377784 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metrics-certs\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.380177 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-rtrbh"] Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.384219 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.479117 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqm8\" (UniqueName: \"kubernetes.io/projected/d2caf019-bd64-4a5c-bf88-c260178bdc82-kube-api-access-xlqm8\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.479177 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metallb-excludel2\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.479238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-startup\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.479854 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d4cl\" (UniqueName: \"kubernetes.io/projected/59437559-8b42-4779-8b72-17f09b50b572-kube-api-access-9d4cl\") pod \"controller-86ddb6bd46-rtrbh\" (UID: 
\"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.479928 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480047 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e843a56-715a-44fc-9974-8570d49bd9a0-cert\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480119 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metrics-certs\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480153 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-cert\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480270 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metallb-excludel2\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 
09:18:33.480289 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-conf\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480357 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480396 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-metrics-certs\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480431 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics-certs\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480455 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-sockets\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480505 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnz6k\" (UniqueName: 
\"kubernetes.io/projected/8e843a56-715a-44fc-9974-8570d49bd9a0-kube-api-access-mnz6k\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480532 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-reloader\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480565 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfn9k\" (UniqueName: \"kubernetes.io/projected/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-kube-api-access-xfn9k\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.480778 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.481030 4883 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.481223 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist podName:6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf nodeName:}" failed. No retries permitted until 2026-03-10 09:18:33.981186958 +0000 UTC m=+900.236084847 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist") pod "speaker-gtqfn" (UID: "6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf") : secret "metallb-memberlist" not found Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.481682 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-startup\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.481980 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-reloader\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.482120 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-conf\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.482237 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d2caf019-bd64-4a5c-bf88-c260178bdc82-frr-sockets\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.489235 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2caf019-bd64-4a5c-bf88-c260178bdc82-metrics-certs\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 
09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.489261 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e843a56-715a-44fc-9974-8570d49bd9a0-cert\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.491409 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-metrics-certs\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.498023 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnz6k\" (UniqueName: \"kubernetes.io/projected/8e843a56-715a-44fc-9974-8570d49bd9a0-kube-api-access-mnz6k\") pod \"frr-k8s-webhook-server-7f989f654f-shjnr\" (UID: \"8e843a56-715a-44fc-9974-8570d49bd9a0\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.498463 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfn9k\" (UniqueName: \"kubernetes.io/projected/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-kube-api-access-xfn9k\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.498813 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqm8\" (UniqueName: \"kubernetes.io/projected/d2caf019-bd64-4a5c-bf88-c260178bdc82-kube-api-access-xlqm8\") pod \"frr-k8s-ck8gb\" (UID: \"d2caf019-bd64-4a5c-bf88-c260178bdc82\") " pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.582094 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-metrics-certs\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.582247 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d4cl\" (UniqueName: \"kubernetes.io/projected/59437559-8b42-4779-8b72-17f09b50b572-kube-api-access-9d4cl\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.582299 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-cert\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.585342 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-metrics-certs\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.587822 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59437559-8b42-4779-8b72-17f09b50b572-cert\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.606043 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d4cl\" (UniqueName: 
\"kubernetes.io/projected/59437559-8b42-4779-8b72-17f09b50b572-kube-api-access-9d4cl\") pod \"controller-86ddb6bd46-rtrbh\" (UID: \"59437559-8b42-4779-8b72-17f09b50b572\") " pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.619892 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.626692 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.707762 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.987886 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.988094 4883 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 09:18:33 crc kubenswrapper[4883]: E0310 09:18:33.988405 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist podName:6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf nodeName:}" failed. No retries permitted until 2026-03-10 09:18:34.988385321 +0000 UTC m=+901.243283210 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist") pod "speaker-gtqfn" (UID: "6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf") : secret "metallb-memberlist" not found Mar 10 09:18:33 crc kubenswrapper[4883]: I0310 09:18:33.993789 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"c48e6b51db803f7bf091f5c7ab3f4f87e4331cc6ecf296ba1a006b55e6a37b92"} Mar 10 09:18:34 crc kubenswrapper[4883]: W0310 09:18:34.017352 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e843a56_715a_44fc_9974_8570d49bd9a0.slice/crio-5d70e004bba8bc3d5af01c37fa4408ae1cc4b602a74915aeac0b4e410587c372 WatchSource:0}: Error finding container 5d70e004bba8bc3d5af01c37fa4408ae1cc4b602a74915aeac0b4e410587c372: Status 404 returned error can't find the container with id 5d70e004bba8bc3d5af01c37fa4408ae1cc4b602a74915aeac0b4e410587c372 Mar 10 09:18:34 crc kubenswrapper[4883]: I0310 09:18:34.017445 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr"] Mar 10 09:18:34 crc kubenswrapper[4883]: I0310 09:18:34.077936 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-rtrbh"] Mar 10 09:18:34 crc kubenswrapper[4883]: I0310 09:18:34.466601 4883 scope.go:117] "RemoveContainer" containerID="49c1aa583870be3098bda47d15de71e40f64a8b97a906132b01f7c81a5eefc00" Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.010006 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:35 crc 
kubenswrapper[4883]: I0310 09:18:35.015834 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf-memberlist\") pod \"speaker-gtqfn\" (UID: \"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf\") " pod="metallb-system/speaker-gtqfn" Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.018606 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-rtrbh" event={"ID":"59437559-8b42-4779-8b72-17f09b50b572","Type":"ContainerStarted","Data":"880991a17c808314c7aabf3da988d5cf29fbb2411c907c62cb9d1b158e7b8aa2"} Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.018680 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-rtrbh" event={"ID":"59437559-8b42-4779-8b72-17f09b50b572","Type":"ContainerStarted","Data":"5e81dfe5f80957735dc460558b44fdd2d2817afe61eb3882c49e0cf69087285e"} Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.018694 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-rtrbh" event={"ID":"59437559-8b42-4779-8b72-17f09b50b572","Type":"ContainerStarted","Data":"5adb675d9fa9826ad58302a2eee32aa7abd1fd3c6044f16c288b588b3fb43b8c"} Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.019515 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.023148 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" event={"ID":"8e843a56-715a-44fc-9974-8570d49bd9a0","Type":"ContainerStarted","Data":"5d70e004bba8bc3d5af01c37fa4408ae1cc4b602a74915aeac0b4e410587c372"} Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.036676 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-rtrbh" 
podStartSLOduration=2.036665355 podStartE2EDuration="2.036665355s" podCreationTimestamp="2026-03-10 09:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:18:35.035760249 +0000 UTC m=+901.290658138" watchObservedRunningTime="2026-03-10 09:18:35.036665355 +0000 UTC m=+901.291563245" Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.204563 4883 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zwvch" Mar 10 09:18:35 crc kubenswrapper[4883]: I0310 09:18:35.213431 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gtqfn" Mar 10 09:18:35 crc kubenswrapper[4883]: W0310 09:18:35.248120 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cecc1fd_5f20_4aff_ae03_570ef8b7dfaf.slice/crio-43ecbda99a36af83821fc853a0701d815e65fb7ee314d8d4ec08d6c8c4052e20 WatchSource:0}: Error finding container 43ecbda99a36af83821fc853a0701d815e65fb7ee314d8d4ec08d6c8c4052e20: Status 404 returned error can't find the container with id 43ecbda99a36af83821fc853a0701d815e65fb7ee314d8d4ec08d6c8c4052e20 Mar 10 09:18:36 crc kubenswrapper[4883]: I0310 09:18:36.047019 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gtqfn" event={"ID":"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf","Type":"ContainerStarted","Data":"915837ce4d089c07163aea4aab7a71e679e1e6a2e7149eea8ef7b499aa944444"} Mar 10 09:18:36 crc kubenswrapper[4883]: I0310 09:18:36.047373 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gtqfn" event={"ID":"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf","Type":"ContainerStarted","Data":"db90983d279711dc94a09992a5c9c9008fa364bc66e5e96b72d128f04ebb263a"} Mar 10 09:18:36 crc kubenswrapper[4883]: I0310 09:18:36.047386 4883 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/speaker-gtqfn" event={"ID":"6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf","Type":"ContainerStarted","Data":"43ecbda99a36af83821fc853a0701d815e65fb7ee314d8d4ec08d6c8c4052e20"} Mar 10 09:18:36 crc kubenswrapper[4883]: I0310 09:18:36.047657 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-gtqfn" Mar 10 09:18:36 crc kubenswrapper[4883]: I0310 09:18:36.071321 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-gtqfn" podStartSLOduration=3.071300969 podStartE2EDuration="3.071300969s" podCreationTimestamp="2026-03-10 09:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:18:36.066756168 +0000 UTC m=+902.321654058" watchObservedRunningTime="2026-03-10 09:18:36.071300969 +0000 UTC m=+902.326198858" Mar 10 09:18:41 crc kubenswrapper[4883]: I0310 09:18:41.081569 4883 generic.go:334] "Generic (PLEG): container finished" podID="d2caf019-bd64-4a5c-bf88-c260178bdc82" containerID="803db553b8a1019048be01f57234e2f5a8121ba433f9f74d82658dbd3059eb65" exitCode=0 Mar 10 09:18:41 crc kubenswrapper[4883]: I0310 09:18:41.081628 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerDied","Data":"803db553b8a1019048be01f57234e2f5a8121ba433f9f74d82658dbd3059eb65"} Mar 10 09:18:41 crc kubenswrapper[4883]: I0310 09:18:41.084501 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" event={"ID":"8e843a56-715a-44fc-9974-8570d49bd9a0","Type":"ContainerStarted","Data":"b469acbbf38a4f61cf3ab9a2a32dcb7fc38e2a8cc18da5844231e0ff0a1c4ce8"} Mar 10 09:18:41 crc kubenswrapper[4883]: I0310 09:18:41.084680 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" 
Mar 10 09:18:41 crc kubenswrapper[4883]: I0310 09:18:41.123616 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" podStartSLOduration=1.920466868 podStartE2EDuration="8.123591205s" podCreationTimestamp="2026-03-10 09:18:33 +0000 UTC" firstStartedPulling="2026-03-10 09:18:34.020150223 +0000 UTC m=+900.275048113" lastFinishedPulling="2026-03-10 09:18:40.22327456 +0000 UTC m=+906.478172450" observedRunningTime="2026-03-10 09:18:41.122157352 +0000 UTC m=+907.377055241" watchObservedRunningTime="2026-03-10 09:18:41.123591205 +0000 UTC m=+907.378489094" Mar 10 09:18:42 crc kubenswrapper[4883]: I0310 09:18:42.091021 4883 generic.go:334] "Generic (PLEG): container finished" podID="d2caf019-bd64-4a5c-bf88-c260178bdc82" containerID="a724b4bc1fe6b2a28586bc3ff26b541d7426f05ec41a110b3a474b8b43217bd4" exitCode=0 Mar 10 09:18:42 crc kubenswrapper[4883]: I0310 09:18:42.091659 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerDied","Data":"a724b4bc1fe6b2a28586bc3ff26b541d7426f05ec41a110b3a474b8b43217bd4"} Mar 10 09:18:42 crc kubenswrapper[4883]: E0310 09:18:42.335931 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2caf019_bd64_4a5c_bf88_c260178bdc82.slice/crio-conmon-c863c467dc5d4ae5a92ff1c179bf60e2a469d4dac4bea0c06dd960d93e869ee9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2caf019_bd64_4a5c_bf88_c260178bdc82.slice/crio-c863c467dc5d4ae5a92ff1c179bf60e2a469d4dac4bea0c06dd960d93e869ee9.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:18:43 crc kubenswrapper[4883]: I0310 09:18:43.099183 4883 generic.go:334] "Generic (PLEG): container finished" 
podID="d2caf019-bd64-4a5c-bf88-c260178bdc82" containerID="c863c467dc5d4ae5a92ff1c179bf60e2a469d4dac4bea0c06dd960d93e869ee9" exitCode=0 Mar 10 09:18:43 crc kubenswrapper[4883]: I0310 09:18:43.099236 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerDied","Data":"c863c467dc5d4ae5a92ff1c179bf60e2a469d4dac4bea0c06dd960d93e869ee9"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111454 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"483735aae593884e5e74863fd2a40b4e3edc5f0768d90ee57ce189fb79e5b8bf"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111923 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111940 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"38e8b9c85c7171175477d63def63eaa96b2ed4bbaf562fc6287f3913921c4394"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111953 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"d00d94411ea0996a0950a0a12d64c1b22ff349a21906a38b282821b2e3f64c92"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111967 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"94987766b545c56642566b6b8d244dc8407c123659832a2e0d83159820af6c2d"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111977 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" 
event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"235aa8497009432dfeac99cbc95f7cec78c395b098464539eb1d760ca9a6ae42"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.111988 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ck8gb" event={"ID":"d2caf019-bd64-4a5c-bf88-c260178bdc82","Type":"ContainerStarted","Data":"4e9d98020cc12ea7ff9b05eedf75c4d4bf6e2462e9b351bd3f30d3e8a8699c67"} Mar 10 09:18:44 crc kubenswrapper[4883]: I0310 09:18:44.138872 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-ck8gb" podStartSLOduration=4.660067311 podStartE2EDuration="11.138852348s" podCreationTimestamp="2026-03-10 09:18:33 +0000 UTC" firstStartedPulling="2026-03-10 09:18:33.741612368 +0000 UTC m=+899.996510256" lastFinishedPulling="2026-03-10 09:18:40.220397405 +0000 UTC m=+906.475295293" observedRunningTime="2026-03-10 09:18:44.135600506 +0000 UTC m=+910.390498395" watchObservedRunningTime="2026-03-10 09:18:44.138852348 +0000 UTC m=+910.393750237" Mar 10 09:18:45 crc kubenswrapper[4883]: I0310 09:18:45.218630 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-gtqfn" Mar 10 09:18:48 crc kubenswrapper[4883]: I0310 09:18:48.621563 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:48 crc kubenswrapper[4883]: I0310 09:18:48.652355 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.633269 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.634299 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.636822 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fk56b" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.636829 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.640162 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.641007 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.652283 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpzqd\" (UniqueName: \"kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd\") pod \"openstack-operator-index-kpm7g\" (UID: \"245d059e-ae23-4152-a123-75424f7694e8\") " pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.754185 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpzqd\" (UniqueName: \"kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd\") pod \"openstack-operator-index-kpm7g\" (UID: \"245d059e-ae23-4152-a123-75424f7694e8\") " pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.774532 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpzqd\" (UniqueName: \"kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd\") pod \"openstack-operator-index-kpm7g\" (UID: 
\"245d059e-ae23-4152-a123-75424f7694e8\") " pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:50 crc kubenswrapper[4883]: I0310 09:18:50.952745 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:51 crc kubenswrapper[4883]: I0310 09:18:51.124004 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:51 crc kubenswrapper[4883]: W0310 09:18:51.128155 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod245d059e_ae23_4152_a123_75424f7694e8.slice/crio-e4fd9244672ea674d347fbc9f7d631b9bbdcee2614a9f81730f802d5fd0ace68 WatchSource:0}: Error finding container e4fd9244672ea674d347fbc9f7d631b9bbdcee2614a9f81730f802d5fd0ace68: Status 404 returned error can't find the container with id e4fd9244672ea674d347fbc9f7d631b9bbdcee2614a9f81730f802d5fd0ace68 Mar 10 09:18:51 crc kubenswrapper[4883]: I0310 09:18:51.152821 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpm7g" event={"ID":"245d059e-ae23-4152-a123-75424f7694e8","Type":"ContainerStarted","Data":"e4fd9244672ea674d347fbc9f7d631b9bbdcee2614a9f81730f802d5fd0ace68"} Mar 10 09:18:53 crc kubenswrapper[4883]: I0310 09:18:53.168219 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpm7g" event={"ID":"245d059e-ae23-4152-a123-75424f7694e8","Type":"ContainerStarted","Data":"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528"} Mar 10 09:18:53 crc kubenswrapper[4883]: I0310 09:18:53.180695 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kpm7g" podStartSLOduration=1.984774228 podStartE2EDuration="3.180679043s" podCreationTimestamp="2026-03-10 09:18:50 +0000 UTC" 
firstStartedPulling="2026-03-10 09:18:51.13088212 +0000 UTC m=+917.385780009" lastFinishedPulling="2026-03-10 09:18:52.326786935 +0000 UTC m=+918.581684824" observedRunningTime="2026-03-10 09:18:53.1787581 +0000 UTC m=+919.433655979" watchObservedRunningTime="2026-03-10 09:18:53.180679043 +0000 UTC m=+919.435576921" Mar 10 09:18:53 crc kubenswrapper[4883]: I0310 09:18:53.642726 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-shjnr" Mar 10 09:18:53 crc kubenswrapper[4883]: I0310 09:18:53.644152 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ck8gb" Mar 10 09:18:53 crc kubenswrapper[4883]: I0310 09:18:53.712120 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-rtrbh" Mar 10 09:18:55 crc kubenswrapper[4883]: I0310 09:18:55.826080 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:55 crc kubenswrapper[4883]: I0310 09:18:55.826745 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-kpm7g" podUID="245d059e-ae23-4152-a123-75424f7694e8" containerName="registry-server" containerID="cri-o://5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528" gracePeriod=2 Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.185315 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.187504 4883 generic.go:334] "Generic (PLEG): container finished" podID="245d059e-ae23-4152-a123-75424f7694e8" containerID="5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528" exitCode=0 Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.187590 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kpm7g" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.187590 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpm7g" event={"ID":"245d059e-ae23-4152-a123-75424f7694e8","Type":"ContainerDied","Data":"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528"} Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.187751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kpm7g" event={"ID":"245d059e-ae23-4152-a123-75424f7694e8","Type":"ContainerDied","Data":"e4fd9244672ea674d347fbc9f7d631b9bbdcee2614a9f81730f802d5fd0ace68"} Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.187777 4883 scope.go:117] "RemoveContainer" containerID="5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.217293 4883 scope.go:117] "RemoveContainer" containerID="5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528" Mar 10 09:18:56 crc kubenswrapper[4883]: E0310 09:18:56.220955 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528\": container with ID starting with 5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528 not found: ID does not exist" 
containerID="5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.221000 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528"} err="failed to get container status \"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528\": rpc error: code = NotFound desc = could not find container \"5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528\": container with ID starting with 5fde06a31a86f41d9bb4e7e772053bd171b09886c48f376ac9eeadf097cfe528 not found: ID does not exist" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.227163 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpzqd\" (UniqueName: \"kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd\") pod \"245d059e-ae23-4152-a123-75424f7694e8\" (UID: \"245d059e-ae23-4152-a123-75424f7694e8\") " Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.241704 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd" (OuterVolumeSpecName: "kube-api-access-vpzqd") pod "245d059e-ae23-4152-a123-75424f7694e8" (UID: "245d059e-ae23-4152-a123-75424f7694e8"). InnerVolumeSpecName "kube-api-access-vpzqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.328489 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpzqd\" (UniqueName: \"kubernetes.io/projected/245d059e-ae23-4152-a123-75424f7694e8-kube-api-access-vpzqd\") on node \"crc\" DevicePath \"\"" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.434781 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-c4vjl"] Mar 10 09:18:56 crc kubenswrapper[4883]: E0310 09:18:56.435393 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245d059e-ae23-4152-a123-75424f7694e8" containerName="registry-server" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.435484 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="245d059e-ae23-4152-a123-75424f7694e8" containerName="registry-server" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.435706 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="245d059e-ae23-4152-a123-75424f7694e8" containerName="registry-server" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.436532 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.439740 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c4vjl"] Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.509565 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.513027 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-kpm7g"] Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.531015 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzk7n\" (UniqueName: \"kubernetes.io/projected/83852eec-509b-4074-b837-4f00d1d07d05-kube-api-access-kzk7n\") pod \"openstack-operator-index-c4vjl\" (UID: \"83852eec-509b-4074-b837-4f00d1d07d05\") " pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.631879 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzk7n\" (UniqueName: \"kubernetes.io/projected/83852eec-509b-4074-b837-4f00d1d07d05-kube-api-access-kzk7n\") pod \"openstack-operator-index-c4vjl\" (UID: \"83852eec-509b-4074-b837-4f00d1d07d05\") " pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.651095 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzk7n\" (UniqueName: \"kubernetes.io/projected/83852eec-509b-4074-b837-4f00d1d07d05-kube-api-access-kzk7n\") pod \"openstack-operator-index-c4vjl\" (UID: \"83852eec-509b-4074-b837-4f00d1d07d05\") " pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.752096 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:18:56 crc kubenswrapper[4883]: I0310 09:18:56.985623 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c4vjl"] Mar 10 09:18:57 crc kubenswrapper[4883]: I0310 09:18:57.197444 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c4vjl" event={"ID":"83852eec-509b-4074-b837-4f00d1d07d05","Type":"ContainerStarted","Data":"3b6401b1164602617b1615252e67892602892a83a8f048757c13194f434e9286"} Mar 10 09:18:58 crc kubenswrapper[4883]: I0310 09:18:58.088153 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="245d059e-ae23-4152-a123-75424f7694e8" path="/var/lib/kubelet/pods/245d059e-ae23-4152-a123-75424f7694e8/volumes" Mar 10 09:18:58 crc kubenswrapper[4883]: I0310 09:18:58.207911 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c4vjl" event={"ID":"83852eec-509b-4074-b837-4f00d1d07d05","Type":"ContainerStarted","Data":"61399a1629beea1d008f65c439f95fd1833bac9b7da534b488fe13a2e0f85b97"} Mar 10 09:18:58 crc kubenswrapper[4883]: I0310 09:18:58.222994 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-c4vjl" podStartSLOduration=1.711004628 podStartE2EDuration="2.222975402s" podCreationTimestamp="2026-03-10 09:18:56 +0000 UTC" firstStartedPulling="2026-03-10 09:18:56.9931872 +0000 UTC m=+923.248085089" lastFinishedPulling="2026-03-10 09:18:57.505157974 +0000 UTC m=+923.760055863" observedRunningTime="2026-03-10 09:18:58.219936 +0000 UTC m=+924.474833890" watchObservedRunningTime="2026-03-10 09:18:58.222975402 +0000 UTC m=+924.477873291" Mar 10 09:19:06 crc kubenswrapper[4883]: I0310 09:19:06.753005 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:19:06 crc 
kubenswrapper[4883]: I0310 09:19:06.753651 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:19:06 crc kubenswrapper[4883]: I0310 09:19:06.781000 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:19:07 crc kubenswrapper[4883]: I0310 09:19:07.289060 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-c4vjl" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.065331 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"] Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.066894 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.069520 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vvsfg" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.075397 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"] Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.101324 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m6fb\" (UniqueName: \"kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.101377 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.101443 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.203617 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m6fb\" (UniqueName: \"kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.203711 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.203794 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.204567 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.204591 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.224947 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m6fb\" (UniqueName: \"kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb\") pod \"951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.387037 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:10 crc kubenswrapper[4883]: I0310 09:19:10.749366 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g"] Mar 10 09:19:10 crc kubenswrapper[4883]: W0310 09:19:10.752948 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8a84a3_2cd3_452c_9e28_5bfa45be11c1.slice/crio-e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143 WatchSource:0}: Error finding container e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143: Status 404 returned error can't find the container with id e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143 Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.291442 4883 generic.go:334] "Generic (PLEG): container finished" podID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerID="51f283cd117e09208a4768930914421d5662d0e2b76ae05bd47783685cc54eaf" exitCode=0 Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.291522 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" event={"ID":"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1","Type":"ContainerDied","Data":"51f283cd117e09208a4768930914421d5662d0e2b76ae05bd47783685cc54eaf"} Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.291790 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" event={"ID":"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1","Type":"ContainerStarted","Data":"e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143"} Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.292774 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider 
Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.437658 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-svjzz"] Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.438948 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.445483 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-svjzz"] Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.520107 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q28dw\" (UniqueName: \"kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.520156 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.520345 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.621749 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.621849 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q28dw\" (UniqueName: \"kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.621902 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.622247 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.622422 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.638653 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q28dw\" (UniqueName: 
\"kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw\") pod \"certified-operators-svjzz\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.753631 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:11 crc kubenswrapper[4883]: I0310 09:19:11.995594 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-svjzz"] Mar 10 09:19:12 crc kubenswrapper[4883]: W0310 09:19:12.015572 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e453b8f_c12a_4f46_9727_af420db90b39.slice/crio-b37781c041b0a6f6c9f87038fbce7700b9a082ca2d8b50119f237b3c9e6d6b29 WatchSource:0}: Error finding container b37781c041b0a6f6c9f87038fbce7700b9a082ca2d8b50119f237b3c9e6d6b29: Status 404 returned error can't find the container with id b37781c041b0a6f6c9f87038fbce7700b9a082ca2d8b50119f237b3c9e6d6b29 Mar 10 09:19:12 crc kubenswrapper[4883]: I0310 09:19:12.299262 4883 generic.go:334] "Generic (PLEG): container finished" podID="7e453b8f-c12a-4f46-9727-af420db90b39" containerID="b2eb0240dcdf5c7c8c770a8f07fe72cb183377dcb0f81a38200cee2f1f8d2464" exitCode=0 Mar 10 09:19:12 crc kubenswrapper[4883]: I0310 09:19:12.299387 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerDied","Data":"b2eb0240dcdf5c7c8c770a8f07fe72cb183377dcb0f81a38200cee2f1f8d2464"} Mar 10 09:19:12 crc kubenswrapper[4883]: I0310 09:19:12.299589 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" 
event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerStarted","Data":"b37781c041b0a6f6c9f87038fbce7700b9a082ca2d8b50119f237b3c9e6d6b29"} Mar 10 09:19:13 crc kubenswrapper[4883]: I0310 09:19:13.311248 4883 generic.go:334] "Generic (PLEG): container finished" podID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerID="443c78311a799b12540fdae003b3a40c61b69151091d181fd46a826a7a5dbc48" exitCode=0 Mar 10 09:19:13 crc kubenswrapper[4883]: I0310 09:19:13.311358 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" event={"ID":"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1","Type":"ContainerDied","Data":"443c78311a799b12540fdae003b3a40c61b69151091d181fd46a826a7a5dbc48"} Mar 10 09:19:13 crc kubenswrapper[4883]: I0310 09:19:13.315293 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerStarted","Data":"10600bc19ee71dbc79e67e3d012f88347a66f894fddbad0aba7be8ddb5c9da60"} Mar 10 09:19:14 crc kubenswrapper[4883]: I0310 09:19:14.325376 4883 generic.go:334] "Generic (PLEG): container finished" podID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerID="bf7f3352df4eb679b65f63cc131fc1d1e6fbab83b5f535bcabc0431a7fe48488" exitCode=0 Mar 10 09:19:14 crc kubenswrapper[4883]: I0310 09:19:14.325504 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" event={"ID":"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1","Type":"ContainerDied","Data":"bf7f3352df4eb679b65f63cc131fc1d1e6fbab83b5f535bcabc0431a7fe48488"} Mar 10 09:19:14 crc kubenswrapper[4883]: I0310 09:19:14.327806 4883 generic.go:334] "Generic (PLEG): container finished" podID="7e453b8f-c12a-4f46-9727-af420db90b39" containerID="10600bc19ee71dbc79e67e3d012f88347a66f894fddbad0aba7be8ddb5c9da60" exitCode=0 Mar 10 09:19:14 crc kubenswrapper[4883]: 
I0310 09:19:14.327845 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerDied","Data":"10600bc19ee71dbc79e67e3d012f88347a66f894fddbad0aba7be8ddb5c9da60"} Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.339008 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerStarted","Data":"dcc56ddb1c3d54c1e1f3561c4de8ae2c9c322ad4aebd1cb8df9e304aabe5bdbc"} Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.360706 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-svjzz" podStartSLOduration=1.833153941 podStartE2EDuration="4.360682349s" podCreationTimestamp="2026-03-10 09:19:11 +0000 UTC" firstStartedPulling="2026-03-10 09:19:12.301827657 +0000 UTC m=+938.556725546" lastFinishedPulling="2026-03-10 09:19:14.829356065 +0000 UTC m=+941.084253954" observedRunningTime="2026-03-10 09:19:15.359824271 +0000 UTC m=+941.614722160" watchObservedRunningTime="2026-03-10 09:19:15.360682349 +0000 UTC m=+941.615580238" Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.552131 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.685261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m6fb\" (UniqueName: \"kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb\") pod \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.685334 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util\") pod \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.685607 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle\") pod \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\" (UID: \"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1\") " Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.686367 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle" (OuterVolumeSpecName: "bundle") pod "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" (UID: "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.691655 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb" (OuterVolumeSpecName: "kube-api-access-7m6fb") pod "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" (UID: "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1"). InnerVolumeSpecName "kube-api-access-7m6fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.696032 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util" (OuterVolumeSpecName: "util") pod "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" (UID: "9b8a84a3-2cd3-452c-9e28-5bfa45be11c1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.787165 4883 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.787197 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m6fb\" (UniqueName: \"kubernetes.io/projected/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-kube-api-access-7m6fb\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:15 crc kubenswrapper[4883]: I0310 09:19:15.787209 4883 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b8a84a3-2cd3-452c-9e28-5bfa45be11c1-util\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:16 crc kubenswrapper[4883]: I0310 09:19:16.346538 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" Mar 10 09:19:16 crc kubenswrapper[4883]: I0310 09:19:16.346885 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g" event={"ID":"9b8a84a3-2cd3-452c-9e28-5bfa45be11c1","Type":"ContainerDied","Data":"e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143"} Mar 10 09:19:16 crc kubenswrapper[4883]: I0310 09:19:16.346915 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e00cc9606a6b490d49652f07709a5f047e9ad69c50ad5359cc899edb1188c143" Mar 10 09:19:17 crc kubenswrapper[4883]: I0310 09:19:17.449578 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:19:17 crc kubenswrapper[4883]: I0310 09:19:17.449876 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.100155 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"] Mar 10 09:19:21 crc kubenswrapper[4883]: E0310 09:19:21.101493 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="util" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.101590 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" 
containerName="util" Mar 10 09:19:21 crc kubenswrapper[4883]: E0310 09:19:21.101646 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="pull" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.101694 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="pull" Mar 10 09:19:21 crc kubenswrapper[4883]: E0310 09:19:21.101757 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="extract" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.101804 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="extract" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.101980 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8a84a3-2cd3-452c-9e28-5bfa45be11c1" containerName="extract" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.102501 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.104341 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-dhgbx" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.118255 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"] Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.163400 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnz2d\" (UniqueName: \"kubernetes.io/projected/31e7ec33-4b44-48ce-9f01-e483a7668dd6-kube-api-access-rnz2d\") pod \"openstack-operator-controller-init-6cf8df7788-tzrb8\" (UID: \"31e7ec33-4b44-48ce-9f01-e483a7668dd6\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.264809 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnz2d\" (UniqueName: \"kubernetes.io/projected/31e7ec33-4b44-48ce-9f01-e483a7668dd6-kube-api-access-rnz2d\") pod \"openstack-operator-controller-init-6cf8df7788-tzrb8\" (UID: \"31e7ec33-4b44-48ce-9f01-e483a7668dd6\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.285774 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnz2d\" (UniqueName: \"kubernetes.io/projected/31e7ec33-4b44-48ce-9f01-e483a7668dd6-kube-api-access-rnz2d\") pod \"openstack-operator-controller-init-6cf8df7788-tzrb8\" (UID: \"31e7ec33-4b44-48ce-9f01-e483a7668dd6\") " pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.416633 4883 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.754542 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.754910 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.888944 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8"] Mar 10 09:19:21 crc kubenswrapper[4883]: I0310 09:19:21.911659 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:22 crc kubenswrapper[4883]: I0310 09:19:22.388291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" event={"ID":"31e7ec33-4b44-48ce-9f01-e483a7668dd6","Type":"ContainerStarted","Data":"89fb5b3a1d23f68f7a9631050a7e369a2662a99ba7164324fe89cf32c693b4e3"} Mar 10 09:19:22 crc kubenswrapper[4883]: I0310 09:19:22.416617 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.028552 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-svjzz"] Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.029133 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-svjzz" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="registry-server" containerID="cri-o://dcc56ddb1c3d54c1e1f3561c4de8ae2c9c322ad4aebd1cb8df9e304aabe5bdbc" gracePeriod=2 Mar 10 09:19:25 crc 
kubenswrapper[4883]: I0310 09:19:25.412016 4883 generic.go:334] "Generic (PLEG): container finished" podID="7e453b8f-c12a-4f46-9727-af420db90b39" containerID="dcc56ddb1c3d54c1e1f3561c4de8ae2c9c322ad4aebd1cb8df9e304aabe5bdbc" exitCode=0 Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.412274 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerDied","Data":"dcc56ddb1c3d54c1e1f3561c4de8ae2c9c322ad4aebd1cb8df9e304aabe5bdbc"} Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.738418 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.939185 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q28dw\" (UniqueName: \"kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw\") pod \"7e453b8f-c12a-4f46-9727-af420db90b39\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.939234 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities\") pod \"7e453b8f-c12a-4f46-9727-af420db90b39\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.939301 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content\") pod \"7e453b8f-c12a-4f46-9727-af420db90b39\" (UID: \"7e453b8f-c12a-4f46-9727-af420db90b39\") " Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.940251 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities" (OuterVolumeSpecName: "utilities") pod "7e453b8f-c12a-4f46-9727-af420db90b39" (UID: "7e453b8f-c12a-4f46-9727-af420db90b39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.944983 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw" (OuterVolumeSpecName: "kube-api-access-q28dw") pod "7e453b8f-c12a-4f46-9727-af420db90b39" (UID: "7e453b8f-c12a-4f46-9727-af420db90b39"). InnerVolumeSpecName "kube-api-access-q28dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:19:25 crc kubenswrapper[4883]: I0310 09:19:25.977333 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e453b8f-c12a-4f46-9727-af420db90b39" (UID: "7e453b8f-c12a-4f46-9727-af420db90b39"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.041219 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q28dw\" (UniqueName: \"kubernetes.io/projected/7e453b8f-c12a-4f46-9727-af420db90b39-kube-api-access-q28dw\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.041255 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.041267 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e453b8f-c12a-4f46-9727-af420db90b39-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.420634 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-svjzz" event={"ID":"7e453b8f-c12a-4f46-9727-af420db90b39","Type":"ContainerDied","Data":"b37781c041b0a6f6c9f87038fbce7700b9a082ca2d8b50119f237b3c9e6d6b29"} Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.420695 4883 scope.go:117] "RemoveContainer" containerID="dcc56ddb1c3d54c1e1f3561c4de8ae2c9c322ad4aebd1cb8df9e304aabe5bdbc" Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.420749 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-svjzz" Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.437648 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-svjzz"] Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.438497 4883 scope.go:117] "RemoveContainer" containerID="10600bc19ee71dbc79e67e3d012f88347a66f894fddbad0aba7be8ddb5c9da60" Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.441171 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-svjzz"] Mar 10 09:19:26 crc kubenswrapper[4883]: I0310 09:19:26.467085 4883 scope.go:117] "RemoveContainer" containerID="b2eb0240dcdf5c7c8c770a8f07fe72cb183377dcb0f81a38200cee2f1f8d2464" Mar 10 09:19:28 crc kubenswrapper[4883]: I0310 09:19:28.087082 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" path="/var/lib/kubelet/pods/7e453b8f-c12a-4f46-9727-af420db90b39/volumes" Mar 10 09:19:31 crc kubenswrapper[4883]: I0310 09:19:31.453919 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" event={"ID":"31e7ec33-4b44-48ce-9f01-e483a7668dd6","Type":"ContainerStarted","Data":"3f27e1efe85e06d1b494a4ba25ff99d7e3d20a593393b2d56bc8d7ee80921fbb"} Mar 10 09:19:31 crc kubenswrapper[4883]: I0310 09:19:31.454404 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" Mar 10 09:19:31 crc kubenswrapper[4883]: I0310 09:19:31.484840 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" podStartSLOduration=1.132157941 podStartE2EDuration="10.484818575s" podCreationTimestamp="2026-03-10 09:19:21 +0000 UTC" firstStartedPulling="2026-03-10 09:19:21.896364131 +0000 UTC 
m=+948.151262021" lastFinishedPulling="2026-03-10 09:19:31.249024766 +0000 UTC m=+957.503922655" observedRunningTime="2026-03-10 09:19:31.478708764 +0000 UTC m=+957.733606652" watchObservedRunningTime="2026-03-10 09:19:31.484818575 +0000 UTC m=+957.739716464" Mar 10 09:19:41 crc kubenswrapper[4883]: I0310 09:19:41.420317 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6cf8df7788-tzrb8" Mar 10 09:19:47 crc kubenswrapper[4883]: I0310 09:19:47.449659 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:19:47 crc kubenswrapper[4883]: I0310 09:19:47.451108 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.130784 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552240-29nz4"] Mar 10 09:20:00 crc kubenswrapper[4883]: E0310 09:20:00.131446 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="extract-content" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.131459 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="extract-content" Mar 10 09:20:00 crc kubenswrapper[4883]: E0310 09:20:00.131491 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="registry-server" Mar 
10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.131498 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="registry-server" Mar 10 09:20:00 crc kubenswrapper[4883]: E0310 09:20:00.131515 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="extract-utilities" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.131522 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="extract-utilities" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.131624 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e453b8f-c12a-4f46-9727-af420db90b39" containerName="registry-server" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.131997 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.133741 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.134088 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.135803 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-29nz4"] Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.138468 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.324557 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c42zw\" (UniqueName: \"kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw\") pod 
\"auto-csr-approver-29552240-29nz4\" (UID: \"ed80b911-07e4-45b8-9324-dfdf65e5a508\") " pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.426800 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c42zw\" (UniqueName: \"kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw\") pod \"auto-csr-approver-29552240-29nz4\" (UID: \"ed80b911-07e4-45b8-9324-dfdf65e5a508\") " pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.446336 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c42zw\" (UniqueName: \"kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw\") pod \"auto-csr-approver-29552240-29nz4\" (UID: \"ed80b911-07e4-45b8-9324-dfdf65e5a508\") " pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.446596 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.644115 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-29nz4"] Mar 10 09:20:00 crc kubenswrapper[4883]: I0310 09:20:00.664374 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552240-29nz4" event={"ID":"ed80b911-07e4-45b8-9324-dfdf65e5a508","Type":"ContainerStarted","Data":"e00c5f24a8113bedb1556c765ed6f8bd88f0d1b5772cf393aac0e1f09785480a"} Mar 10 09:20:02 crc kubenswrapper[4883]: I0310 09:20:02.680529 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552240-29nz4" event={"ID":"ed80b911-07e4-45b8-9324-dfdf65e5a508","Type":"ContainerStarted","Data":"7f58830e2474379901c1e5ffc8128426c4b2e0d855c9a6d20d780eb3f2958fa5"} Mar 10 09:20:02 crc kubenswrapper[4883]: I0310 09:20:02.707195 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552240-29nz4" podStartSLOduration=0.970941138 podStartE2EDuration="2.707181897s" podCreationTimestamp="2026-03-10 09:20:00 +0000 UTC" firstStartedPulling="2026-03-10 09:20:00.650129202 +0000 UTC m=+986.905027091" lastFinishedPulling="2026-03-10 09:20:02.386369961 +0000 UTC m=+988.641267850" observedRunningTime="2026-03-10 09:20:02.703788489 +0000 UTC m=+988.958686378" watchObservedRunningTime="2026-03-10 09:20:02.707181897 +0000 UTC m=+988.962079786" Mar 10 09:20:03 crc kubenswrapper[4883]: I0310 09:20:03.691239 4883 generic.go:334] "Generic (PLEG): container finished" podID="ed80b911-07e4-45b8-9324-dfdf65e5a508" containerID="7f58830e2474379901c1e5ffc8128426c4b2e0d855c9a6d20d780eb3f2958fa5" exitCode=0 Mar 10 09:20:03 crc kubenswrapper[4883]: I0310 09:20:03.691347 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552240-29nz4" 
event={"ID":"ed80b911-07e4-45b8-9324-dfdf65e5a508","Type":"ContainerDied","Data":"7f58830e2474379901c1e5ffc8128426c4b2e0d855c9a6d20d780eb3f2958fa5"} Mar 10 09:20:04 crc kubenswrapper[4883]: I0310 09:20:04.926755 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.089389 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c42zw\" (UniqueName: \"kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw\") pod \"ed80b911-07e4-45b8-9324-dfdf65e5a508\" (UID: \"ed80b911-07e4-45b8-9324-dfdf65e5a508\") " Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.095786 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw" (OuterVolumeSpecName: "kube-api-access-c42zw") pod "ed80b911-07e4-45b8-9324-dfdf65e5a508" (UID: "ed80b911-07e4-45b8-9324-dfdf65e5a508"). InnerVolumeSpecName "kube-api-access-c42zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.191177 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c42zw\" (UniqueName: \"kubernetes.io/projected/ed80b911-07e4-45b8-9324-dfdf65e5a508-kube-api-access-c42zw\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.704928 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552240-29nz4" event={"ID":"ed80b911-07e4-45b8-9324-dfdf65e5a508","Type":"ContainerDied","Data":"e00c5f24a8113bedb1556c765ed6f8bd88f0d1b5772cf393aac0e1f09785480a"} Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.704974 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-29nz4" Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.704995 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e00c5f24a8113bedb1556c765ed6f8bd88f0d1b5772cf393aac0e1f09785480a" Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.747211 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-ftnh5"] Mar 10 09:20:05 crc kubenswrapper[4883]: I0310 09:20:05.753835 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-ftnh5"] Mar 10 09:20:06 crc kubenswrapper[4883]: I0310 09:20:06.087509 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9" path="/var/lib/kubelet/pods/80bb4328-8f40-40cc-b9a8-5dfeb0d8fdd9/volumes" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.101331 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj"] Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.102078 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed80b911-07e4-45b8-9324-dfdf65e5a508" containerName="oc" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.102092 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed80b911-07e4-45b8-9324-dfdf65e5a508" containerName="oc" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.102219 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed80b911-07e4-45b8-9324-dfdf65e5a508" containerName="oc" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.102595 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.106196 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6mtmj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.110613 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.111591 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.113302 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2nbzk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.115596 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.119235 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.120071 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.121451 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hscmq" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.125196 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.134610 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2smxq\" (UniqueName: \"kubernetes.io/projected/9a394c48-31ca-4e99-b210-45ae6f67faaa-kube-api-access-2smxq\") pod \"designate-operator-controller-manager-66d56f6ff4-h2cxw\" (UID: \"9a394c48-31ca-4e99-b210-45ae6f67faaa\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.134663 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kwbw\" (UniqueName: \"kubernetes.io/projected/09a04267-a914-4c55-add8-735a053038d3-kube-api-access-7kwbw\") pod \"cinder-operator-controller-manager-984cd4dcf-nzdsk\" (UID: \"09a04267-a914-4c55-add8-735a053038d3\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.134959 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72bds\" (UniqueName: \"kubernetes.io/projected/ac18771f-5f45-40d8-b275-38e2e1c48ba6-kube-api-access-72bds\") pod \"barbican-operator-controller-manager-677bd678f7-q52nj\" (UID: \"ac18771f-5f45-40d8-b275-38e2e1c48ba6\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 
09:20:08.142814 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.146655 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.147447 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.150421 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-sk8jb" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.163327 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.164019 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.168005 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-7wflq" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.180552 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.189813 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.190583 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.191817 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-q258g" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.193617 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.199629 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.214935 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.221390 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.221936 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.222022 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.225642 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-hzg6c" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.225789 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.225909 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4ld57" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.227616 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.228382 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.237771 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6mfs8" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.239149 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2smxq\" (UniqueName: \"kubernetes.io/projected/9a394c48-31ca-4e99-b210-45ae6f67faaa-kube-api-access-2smxq\") pod \"designate-operator-controller-manager-66d56f6ff4-h2cxw\" (UID: \"9a394c48-31ca-4e99-b210-45ae6f67faaa\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.239209 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kwbw\" (UniqueName: \"kubernetes.io/projected/09a04267-a914-4c55-add8-735a053038d3-kube-api-access-7kwbw\") pod \"cinder-operator-controller-manager-984cd4dcf-nzdsk\" (UID: \"09a04267-a914-4c55-add8-735a053038d3\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.239316 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72bds\" (UniqueName: \"kubernetes.io/projected/ac18771f-5f45-40d8-b275-38e2e1c48ba6-kube-api-access-72bds\") pod \"barbican-operator-controller-manager-677bd678f7-q52nj\" (UID: \"ac18771f-5f45-40d8-b275-38e2e1c48ba6\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.257650 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.263668 4883 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.296067 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.299627 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2smxq\" (UniqueName: \"kubernetes.io/projected/9a394c48-31ca-4e99-b210-45ae6f67faaa-kube-api-access-2smxq\") pod \"designate-operator-controller-manager-66d56f6ff4-h2cxw\" (UID: \"9a394c48-31ca-4e99-b210-45ae6f67faaa\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.326086 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kwbw\" (UniqueName: \"kubernetes.io/projected/09a04267-a914-4c55-add8-735a053038d3-kube-api-access-7kwbw\") pod \"cinder-operator-controller-manager-984cd4dcf-nzdsk\" (UID: \"09a04267-a914-4c55-add8-735a053038d3\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.329675 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72bds\" (UniqueName: \"kubernetes.io/projected/ac18771f-5f45-40d8-b275-38e2e1c48ba6-kube-api-access-72bds\") pod \"barbican-operator-controller-manager-677bd678f7-q52nj\" (UID: \"ac18771f-5f45-40d8-b275-38e2e1c48ba6\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.344764 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346063 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-6h6dl\" (UniqueName: \"kubernetes.io/projected/ad93994a-26d2-4353-80be-456c1311020e-kube-api-access-6h6dl\") pod \"keystone-operator-controller-manager-684f77d66d-v5kxw\" (UID: \"ad93994a-26d2-4353-80be-456c1311020e\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346117 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fpv7\" (UniqueName: \"kubernetes.io/projected/c994e4ad-140c-4655-ad69-e4013406d12e-kube-api-access-6fpv7\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346217 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mzzc\" (UniqueName: \"kubernetes.io/projected/bf027c79-6bdb-4cfb-8c31-d785b80e2231-kube-api-access-5mzzc\") pod \"heat-operator-controller-manager-77b6666d85-mbxnn\" (UID: \"bf027c79-6bdb-4cfb-8c31-d785b80e2231\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346240 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8llj\" (UniqueName: \"kubernetes.io/projected/884f7bcb-08ef-49f3-912b-ca921e342615-kube-api-access-l8llj\") pod \"ironic-operator-controller-manager-6bbb499bbc-txdwh\" (UID: \"884f7bcb-08ef-49f3-912b-ca921e342615\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346267 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qls6d\" (UniqueName: 
\"kubernetes.io/projected/8a4cb5eb-0894-440e-8cfd-448651696a6f-kube-api-access-qls6d\") pod \"horizon-operator-controller-manager-6d9d6b584d-fvwbt\" (UID: \"8a4cb5eb-0894-440e-8cfd-448651696a6f\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346292 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.346327 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zmxb\" (UniqueName: \"kubernetes.io/projected/63474f68-d09d-4822-b650-96a37aead592-kube-api-access-5zmxb\") pod \"glance-operator-controller-manager-5964f64c48-w9dbp\" (UID: \"63474f68-d09d-4822-b650-96a37aead592\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.348797 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.354155 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kxnph" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.358795 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.362718 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.367423 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-mzfp6" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.372949 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.387425 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.394622 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.395729 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.406974 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-m49wc" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.419181 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.433860 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.436400 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.436665 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453212 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62r6d\" (UniqueName: \"kubernetes.io/projected/8b177c77-d85f-4374-b6db-a700719c1282-kube-api-access-62r6d\") pod \"manila-operator-controller-manager-68f45f9d9f-dgrlb\" (UID: \"8b177c77-d85f-4374-b6db-a700719c1282\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453367 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h6dl\" (UniqueName: \"kubernetes.io/projected/ad93994a-26d2-4353-80be-456c1311020e-kube-api-access-6h6dl\") pod \"keystone-operator-controller-manager-684f77d66d-v5kxw\" (UID: \"ad93994a-26d2-4353-80be-456c1311020e\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453456 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fpv7\" (UniqueName: \"kubernetes.io/projected/c994e4ad-140c-4655-ad69-e4013406d12e-kube-api-access-6fpv7\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453574 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s6xm\" (UniqueName: \"kubernetes.io/projected/91415f40-08a2-451b-abe8-38c7b447e66f-kube-api-access-2s6xm\") pod \"neutron-operator-controller-manager-776c5696bf-snvh5\" (UID: \"91415f40-08a2-451b-abe8-38c7b447e66f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453669 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mzzc\" (UniqueName: \"kubernetes.io/projected/bf027c79-6bdb-4cfb-8c31-d785b80e2231-kube-api-access-5mzzc\") pod \"heat-operator-controller-manager-77b6666d85-mbxnn\" (UID: \"bf027c79-6bdb-4cfb-8c31-d785b80e2231\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453737 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8llj\" (UniqueName: \"kubernetes.io/projected/884f7bcb-08ef-49f3-912b-ca921e342615-kube-api-access-l8llj\") pod \"ironic-operator-controller-manager-6bbb499bbc-txdwh\" (UID: \"884f7bcb-08ef-49f3-912b-ca921e342615\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453837 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qls6d\" (UniqueName: \"kubernetes.io/projected/8a4cb5eb-0894-440e-8cfd-448651696a6f-kube-api-access-qls6d\") pod \"horizon-operator-controller-manager-6d9d6b584d-fvwbt\" (UID: \"8a4cb5eb-0894-440e-8cfd-448651696a6f\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453913 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9twc\" (UniqueName: 
\"kubernetes.io/projected/ec624ec4-966f-410c-95c7-73be0f9cad27-kube-api-access-m9twc\") pod \"mariadb-operator-controller-manager-658d4cdd5-kz9sv\" (UID: \"ec624ec4-966f-410c-95c7-73be0f9cad27\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.453983 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.454060 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zmxb\" (UniqueName: \"kubernetes.io/projected/63474f68-d09d-4822-b650-96a37aead592-kube-api-access-5zmxb\") pod \"glance-operator-controller-manager-5964f64c48-w9dbp\" (UID: \"63474f68-d09d-4822-b650-96a37aead592\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.455061 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.455194 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:08.955177298 +0000 UTC m=+995.210075187 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.458616 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.459570 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.462807 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-2qxwv" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.474542 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.475508 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.477178 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zmxb\" (UniqueName: \"kubernetes.io/projected/63474f68-d09d-4822-b650-96a37aead592-kube-api-access-5zmxb\") pod \"glance-operator-controller-manager-5964f64c48-w9dbp\" (UID: \"63474f68-d09d-4822-b650-96a37aead592\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.478641 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bs9kv" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.482558 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.497468 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.497952 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h6dl\" (UniqueName: \"kubernetes.io/projected/ad93994a-26d2-4353-80be-456c1311020e-kube-api-access-6h6dl\") pod \"keystone-operator-controller-manager-684f77d66d-v5kxw\" (UID: \"ad93994a-26d2-4353-80be-456c1311020e\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.498535 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.499305 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8llj\" (UniqueName: \"kubernetes.io/projected/884f7bcb-08ef-49f3-912b-ca921e342615-kube-api-access-l8llj\") pod \"ironic-operator-controller-manager-6bbb499bbc-txdwh\" (UID: \"884f7bcb-08ef-49f3-912b-ca921e342615\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.504054 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-f4pm9" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.504552 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.510125 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fpv7\" (UniqueName: \"kubernetes.io/projected/c994e4ad-140c-4655-ad69-e4013406d12e-kube-api-access-6fpv7\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.514518 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.518781 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.519188 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mzzc\" (UniqueName: 
\"kubernetes.io/projected/bf027c79-6bdb-4cfb-8c31-d785b80e2231-kube-api-access-5mzzc\") pod \"heat-operator-controller-manager-77b6666d85-mbxnn\" (UID: \"bf027c79-6bdb-4cfb-8c31-d785b80e2231\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.519804 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.522088 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qls6d\" (UniqueName: \"kubernetes.io/projected/8a4cb5eb-0894-440e-8cfd-448651696a6f-kube-api-access-qls6d\") pod \"horizon-operator-controller-manager-6d9d6b584d-fvwbt\" (UID: \"8a4cb5eb-0894-440e-8cfd-448651696a6f\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.528702 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.529463 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.537556 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m6wph"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.538429 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.540563 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cggl5" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.540641 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.540882 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-nh9bc" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.542190 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.545784 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.551356 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8flq2" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.551555 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m6wph"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.557974 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9twc\" (UniqueName: \"kubernetes.io/projected/ec624ec4-966f-410c-95c7-73be0f9cad27-kube-api-access-m9twc\") pod \"mariadb-operator-controller-manager-658d4cdd5-kz9sv\" (UID: \"ec624ec4-966f-410c-95c7-73be0f9cad27\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" 
Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558026 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8h6h\" (UniqueName: \"kubernetes.io/projected/760c8dff-c64a-492b-a778-45ef16d197bd-kube-api-access-g8h6h\") pod \"nova-operator-controller-manager-569cc54c5-rpwdx\" (UID: \"760c8dff-c64a-492b-a778-45ef16d197bd\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558098 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28zzb\" (UniqueName: \"kubernetes.io/projected/c13f33e2-dd6a-4ca0-91e7-5489c753e273-kube-api-access-28zzb\") pod \"ovn-operator-controller-manager-bbc5b68f9-qnwgj\" (UID: \"c13f33e2-dd6a-4ca0-91e7-5489c753e273\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558140 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62r6d\" (UniqueName: \"kubernetes.io/projected/8b177c77-d85f-4374-b6db-a700719c1282-kube-api-access-62r6d\") pod \"manila-operator-controller-manager-68f45f9d9f-dgrlb\" (UID: \"8b177c77-d85f-4374-b6db-a700719c1282\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558163 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr7k9\" (UniqueName: \"kubernetes.io/projected/d0e08342-2d1b-42d9-921e-1d948f701a58-kube-api-access-fr7k9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-49gjk\" (UID: \"d0e08342-2d1b-42d9-921e-1d948f701a58\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558197 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558288 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dnlh\" (UniqueName: \"kubernetes.io/projected/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-kube-api-access-2dnlh\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558381 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztg6p\" (UniqueName: \"kubernetes.io/projected/1b429bd6-00de-4cc2-8a18-9f58897b6834-kube-api-access-ztg6p\") pod \"swift-operator-controller-manager-677c674df7-m6wph\" (UID: \"1b429bd6-00de-4cc2-8a18-9f58897b6834\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.558411 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s6xm\" (UniqueName: \"kubernetes.io/projected/91415f40-08a2-451b-abe8-38c7b447e66f-kube-api-access-2s6xm\") pod \"neutron-operator-controller-manager-776c5696bf-snvh5\" (UID: \"91415f40-08a2-451b-abe8-38c7b447e66f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.559331 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bw2b\" 
(UniqueName: \"kubernetes.io/projected/04b3aecb-7cfd-4042-b003-4bc8c339aff8-kube-api-access-9bw2b\") pod \"placement-operator-controller-manager-574d45c66c-pppd9\" (UID: \"04b3aecb-7cfd-4042-b003-4bc8c339aff8\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.573559 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.576056 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.578093 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jhct9" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.583376 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.585678 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.596159 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s6xm\" (UniqueName: \"kubernetes.io/projected/91415f40-08a2-451b-abe8-38c7b447e66f-kube-api-access-2s6xm\") pod \"neutron-operator-controller-manager-776c5696bf-snvh5\" (UID: \"91415f40-08a2-451b-abe8-38c7b447e66f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.600121 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9twc\" (UniqueName: \"kubernetes.io/projected/ec624ec4-966f-410c-95c7-73be0f9cad27-kube-api-access-m9twc\") pod \"mariadb-operator-controller-manager-658d4cdd5-kz9sv\" (UID: \"ec624ec4-966f-410c-95c7-73be0f9cad27\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.600321 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.607930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62r6d\" (UniqueName: \"kubernetes.io/projected/8b177c77-d85f-4374-b6db-a700719c1282-kube-api-access-62r6d\") pod \"manila-operator-controller-manager-68f45f9d9f-dgrlb\" (UID: \"8b177c77-d85f-4374-b6db-a700719c1282\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.625863 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.626794 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.631604 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nvmll" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.643580 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.661821 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.661854 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dnlh\" (UniqueName: \"kubernetes.io/projected/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-kube-api-access-2dnlh\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.661883 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwgl\" (UniqueName: \"kubernetes.io/projected/d3d3c04d-7e05-4df2-85c6-394d0bde1a69-kube-api-access-7vwgl\") pod \"test-operator-controller-manager-5c5cb9c4d7-8mpp4\" (UID: \"d3d3c04d-7e05-4df2-85c6-394d0bde1a69\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.661945 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ztg6p\" (UniqueName: \"kubernetes.io/projected/1b429bd6-00de-4cc2-8a18-9f58897b6834-kube-api-access-ztg6p\") pod \"swift-operator-controller-manager-677c674df7-m6wph\" (UID: \"1b429bd6-00de-4cc2-8a18-9f58897b6834\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.662013 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bw2b\" (UniqueName: \"kubernetes.io/projected/04b3aecb-7cfd-4042-b003-4bc8c339aff8-kube-api-access-9bw2b\") pod \"placement-operator-controller-manager-574d45c66c-pppd9\" (UID: \"04b3aecb-7cfd-4042-b003-4bc8c339aff8\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.662031 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88nmg\" (UniqueName: \"kubernetes.io/projected/3f4c2998-b51a-4620-b674-60bb0817eb7d-kube-api-access-88nmg\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-mkjnt\" (UID: \"3f4c2998-b51a-4620-b674-60bb0817eb7d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.662064 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8h6h\" (UniqueName: \"kubernetes.io/projected/760c8dff-c64a-492b-a778-45ef16d197bd-kube-api-access-g8h6h\") pod \"nova-operator-controller-manager-569cc54c5-rpwdx\" (UID: \"760c8dff-c64a-492b-a778-45ef16d197bd\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.662208 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28zzb\" (UniqueName: 
\"kubernetes.io/projected/c13f33e2-dd6a-4ca0-91e7-5489c753e273-kube-api-access-28zzb\") pod \"ovn-operator-controller-manager-bbc5b68f9-qnwgj\" (UID: \"c13f33e2-dd6a-4ca0-91e7-5489c753e273\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.662273 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr7k9\" (UniqueName: \"kubernetes.io/projected/d0e08342-2d1b-42d9-921e-1d948f701a58-kube-api-access-fr7k9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-49gjk\" (UID: \"d0e08342-2d1b-42d9-921e-1d948f701a58\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.662222 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.662435 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:09.162416994 +0000 UTC m=+995.417314883 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.681174 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dnlh\" (UniqueName: \"kubernetes.io/projected/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-kube-api-access-2dnlh\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.684710 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.684709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bw2b\" (UniqueName: \"kubernetes.io/projected/04b3aecb-7cfd-4042-b003-4bc8c339aff8-kube-api-access-9bw2b\") pod \"placement-operator-controller-manager-574d45c66c-pppd9\" (UID: \"04b3aecb-7cfd-4042-b003-4bc8c339aff8\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.685260 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.727661 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztg6p\" (UniqueName: \"kubernetes.io/projected/1b429bd6-00de-4cc2-8a18-9f58897b6834-kube-api-access-ztg6p\") pod \"swift-operator-controller-manager-677c674df7-m6wph\" (UID: \"1b429bd6-00de-4cc2-8a18-9f58897b6834\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.730300 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr7k9\" (UniqueName: \"kubernetes.io/projected/d0e08342-2d1b-42d9-921e-1d948f701a58-kube-api-access-fr7k9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-49gjk\" (UID: \"d0e08342-2d1b-42d9-921e-1d948f701a58\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.731298 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28zzb\" (UniqueName: \"kubernetes.io/projected/c13f33e2-dd6a-4ca0-91e7-5489c753e273-kube-api-access-28zzb\") pod \"ovn-operator-controller-manager-bbc5b68f9-qnwgj\" (UID: \"c13f33e2-dd6a-4ca0-91e7-5489c753e273\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.733384 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.736145 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8h6h\" (UniqueName: \"kubernetes.io/projected/760c8dff-c64a-492b-a778-45ef16d197bd-kube-api-access-g8h6h\") pod \"nova-operator-controller-manager-569cc54c5-rpwdx\" (UID: \"760c8dff-c64a-492b-a778-45ef16d197bd\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.766185 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.780266 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vwgl\" (UniqueName: \"kubernetes.io/projected/d3d3c04d-7e05-4df2-85c6-394d0bde1a69-kube-api-access-7vwgl\") pod \"test-operator-controller-manager-5c5cb9c4d7-8mpp4\" (UID: \"d3d3c04d-7e05-4df2-85c6-394d0bde1a69\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.780529 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88nmg\" (UniqueName: \"kubernetes.io/projected/3f4c2998-b51a-4620-b674-60bb0817eb7d-kube-api-access-88nmg\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-mkjnt\" (UID: \"3f4c2998-b51a-4620-b674-60bb0817eb7d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.786333 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.799833 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88nmg\" (UniqueName: \"kubernetes.io/projected/3f4c2998-b51a-4620-b674-60bb0817eb7d-kube-api-access-88nmg\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-mkjnt\" (UID: \"3f4c2998-b51a-4620-b674-60bb0817eb7d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.801617 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.809709 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.813016 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.814857 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.816120 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.818918 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-j28tm" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.824275 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.833654 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vwgl\" (UniqueName: \"kubernetes.io/projected/d3d3c04d-7e05-4df2-85c6-394d0bde1a69-kube-api-access-7vwgl\") pod \"test-operator-controller-manager-5c5cb9c4d7-8mpp4\" (UID: \"d3d3c04d-7e05-4df2-85c6-394d0bde1a69\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.841669 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.863644 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.864891 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.871315 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rpzgd" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.871535 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.871658 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.875580 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.882727 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qcgt\" (UniqueName: \"kubernetes.io/projected/a7216675-a296-4faa-9dd5-d857b15ffa3c-kube-api-access-9qcgt\") pod \"watcher-operator-controller-manager-6dd88c6f67-rkjsw\" (UID: \"a7216675-a296-4faa-9dd5-d857b15ffa3c\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.891319 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.899078 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.915989 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.918377 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.919455 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.922530 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-mzvkm" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.934233 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn"] Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.941293 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989121 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzr7f\" (UniqueName: \"kubernetes.io/projected/969b2d39-fb99-42df-8e6e-3ded5cd292c8-kube-api-access-hzr7f\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989174 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll9mf\" (UniqueName: \"kubernetes.io/projected/475c1190-6d94-431a-943d-4e749ea87d6b-kube-api-access-ll9mf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjjsn\" (UID: \"475c1190-6d94-431a-943d-4e749ea87d6b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989213 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989240 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 
09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989335 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:08 crc kubenswrapper[4883]: I0310 09:20:08.989383 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qcgt\" (UniqueName: \"kubernetes.io/projected/a7216675-a296-4faa-9dd5-d857b15ffa3c-kube-api-access-9qcgt\") pod \"watcher-operator-controller-manager-6dd88c6f67-rkjsw\" (UID: \"a7216675-a296-4faa-9dd5-d857b15ffa3c\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.989575 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:08 crc kubenswrapper[4883]: E0310 09:20:08.989635 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:09.989617487 +0000 UTC m=+996.244515376 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.017845 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qcgt\" (UniqueName: \"kubernetes.io/projected/a7216675-a296-4faa-9dd5-d857b15ffa3c-kube-api-access-9qcgt\") pod \"watcher-operator-controller-manager-6dd88c6f67-rkjsw\" (UID: \"a7216675-a296-4faa-9dd5-d857b15ffa3c\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.090576 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzr7f\" (UniqueName: \"kubernetes.io/projected/969b2d39-fb99-42df-8e6e-3ded5cd292c8-kube-api-access-hzr7f\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.090623 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll9mf\" (UniqueName: \"kubernetes.io/projected/475c1190-6d94-431a-943d-4e749ea87d6b-kube-api-access-ll9mf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjjsn\" (UID: \"475c1190-6d94-431a-943d-4e749ea87d6b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.090649 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: 
\"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.090666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.090833 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.090877 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:09.590864376 +0000 UTC m=+995.845762265 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.091463 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.091550 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:09.591535281 +0000 UTC m=+995.846433170 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.109279 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzr7f\" (UniqueName: \"kubernetes.io/projected/969b2d39-fb99-42df-8e6e-3ded5cd292c8-kube-api-access-hzr7f\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.110492 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll9mf\" (UniqueName: \"kubernetes.io/projected/475c1190-6d94-431a-943d-4e749ea87d6b-kube-api-access-ll9mf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjjsn\" (UID: \"475c1190-6d94-431a-943d-4e749ea87d6b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.191168 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.191748 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.191973 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.192025 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:10.192010125 +0000 UTC m=+996.446908014 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.239255 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.413225 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.419323 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.422711 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw"] Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.439705 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad93994a_26d2_4353_80be_456c1311020e.slice/crio-5b38c901f6b7f89ef7b83e3b72b180ef7efe84c1ca1ac084fa760daeb3bbbb25 WatchSource:0}: Error finding container 5b38c901f6b7f89ef7b83e3b72b180ef7efe84c1ca1ac084fa760daeb3bbbb25: Status 404 returned error can't find the container with id 5b38c901f6b7f89ef7b83e3b72b180ef7efe84c1ca1ac084fa760daeb3bbbb25 Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.500616 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.504968 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.573608 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.578779 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.583496 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt"] Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.585902 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04b3aecb_7cfd_4042_b003_4bc8c339aff8.slice/crio-60bbb8bffb79515b3a41c2c1f2b07bef370cae406c41ccd07dc12cc8fb433b6c WatchSource:0}: Error finding container 60bbb8bffb79515b3a41c2c1f2b07bef370cae406c41ccd07dc12cc8fb433b6c: Status 404 returned error can't find the container with id 60bbb8bffb79515b3a41c2c1f2b07bef370cae406c41ccd07dc12cc8fb433b6c Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.586296 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a4cb5eb_0894_440e_8cfd_448651696a6f.slice/crio-e9edadaaaaecb94eeaf9e713b4d29c8e53243c97e501e6db6e0f86f8d185c531 WatchSource:0}: Error finding container e9edadaaaaecb94eeaf9e713b4d29c8e53243c97e501e6db6e0f86f8d185c531: Status 404 returned error can't find the container with id e9edadaaaaecb94eeaf9e713b4d29c8e53243c97e501e6db6e0f86f8d185c531 Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.595955 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.598980 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " 
pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.599023 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.599191 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.599245 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:10.599229869 +0000 UTC m=+996.854127759 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.599279 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.599384 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:10.599359093 +0000 UTC m=+996.854256982 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.600013 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf027c79_6bdb_4cfb_8c31_d785b80e2231.slice/crio-44cd699c14bc3891cde617e4b29e7c7a5c4ee18b7a9fe3dcaff0fb824962684a WatchSource:0}: Error finding container 44cd699c14bc3891cde617e4b29e7c7a5c4ee18b7a9fe3dcaff0fb824962684a: Status 404 returned error can't find the container with id 44cd699c14bc3891cde617e4b29e7c7a5c4ee18b7a9fe3dcaff0fb824962684a Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.605554 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx"] Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.611397 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2s6xm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-snvh5_openstack-operators(91415f40-08a2-451b-abe8-38c7b447e66f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.612885 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" podUID="91415f40-08a2-451b-abe8-38c7b447e66f" Mar 10 09:20:09 crc 
kubenswrapper[4883]: I0310 09:20:09.614724 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.618632 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb"] Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.620780 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b177c77_d85f_4374_b6db_a700719c1282.slice/crio-70a38461ea45c63c7783503e6dd9bde5a86b428afbbe53a9331303eb17d10568 WatchSource:0}: Error finding container 70a38461ea45c63c7783503e6dd9bde5a86b428afbbe53a9331303eb17d10568: Status 404 returned error can't find the container with id 70a38461ea45c63c7783503e6dd9bde5a86b428afbbe53a9331303eb17d10568 Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.622100 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv"] Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.622843 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec624ec4_966f_410c_95c7_73be0f9cad27.slice/crio-8ed5f266616163b2dab0d917ea7973f91352898c5fc26400c3adc4f5c0fc8803 WatchSource:0}: Error finding container 8ed5f266616163b2dab0d917ea7973f91352898c5fc26400c3adc4f5c0fc8803: Status 404 returned error can't find the container with id 8ed5f266616163b2dab0d917ea7973f91352898c5fc26400c3adc4f5c0fc8803 Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.623169 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-62r6d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-68f45f9d9f-dgrlb_openstack-operators(8b177c77-d85f-4374-b6db-a700719c1282): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.624859 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" podUID="8b177c77-d85f-4374-b6db-a700719c1282" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.627588 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m9twc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-658d4cdd5-kz9sv_openstack-operators(ec624ec4-966f-410c-95c7-73be0f9cad27): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.628811 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" podUID="ec624ec4-966f-410c-95c7-73be0f9cad27" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.769091 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" event={"ID":"09a04267-a914-4c55-add8-735a053038d3","Type":"ContainerStarted","Data":"57b7d167675581f7d77dbe6719ce7571cfacfbf3a9df39d8dcb201f7b39c4efd"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.773620 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" 
event={"ID":"884f7bcb-08ef-49f3-912b-ca921e342615","Type":"ContainerStarted","Data":"47b69c6469e9c0fda49dc764279cc1c2fadd463dd8a89ae2d0549de32d4aaede"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.775073 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" event={"ID":"bf027c79-6bdb-4cfb-8c31-d785b80e2231","Type":"ContainerStarted","Data":"44cd699c14bc3891cde617e4b29e7c7a5c4ee18b7a9fe3dcaff0fb824962684a"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.775116 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.776145 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" event={"ID":"9a394c48-31ca-4e99-b210-45ae6f67faaa","Type":"ContainerStarted","Data":"168479b3a9b0d2df2917b7917bf2f844296689cced63fd65042a333c2530e9f0"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.776917 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" event={"ID":"63474f68-d09d-4822-b650-96a37aead592","Type":"ContainerStarted","Data":"83d1f4358fd5f31070cdf493abed356016a218487b8aac79e0f7df81791ff9fd"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.779591 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.779993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" event={"ID":"8a4cb5eb-0894-440e-8cfd-448651696a6f","Type":"ContainerStarted","Data":"e9edadaaaaecb94eeaf9e713b4d29c8e53243c97e501e6db6e0f86f8d185c531"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 
09:20:09.783336 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk"] Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.789462 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-88nmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-mkjnt_openstack-operators(3f4c2998-b51a-4620-b674-60bb0817eb7d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.789470 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fr7k9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-49gjk_openstack-operators(d0e08342-2d1b-42d9-921e-1d948f701a58): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.790252 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" event={"ID":"04b3aecb-7cfd-4042-b003-4bc8c339aff8","Type":"ContainerStarted","Data":"60bbb8bffb79515b3a41c2c1f2b07bef370cae406c41ccd07dc12cc8fb433b6c"} Mar 10 
09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.790793 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" podUID="d0e08342-2d1b-42d9-921e-1d948f701a58" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.790839 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" podUID="3f4c2998-b51a-4620-b674-60bb0817eb7d" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.792998 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj"] Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.795927 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13f33e2_dd6a_4ca0_91e7_5489c753e273.slice/crio-95d2af89ebbe9034f960439b27501b09e2bac4eb8f1ce5a69820634941438262 WatchSource:0}: Error finding container 95d2af89ebbe9034f960439b27501b09e2bac4eb8f1ce5a69820634941438262: Status 404 returned error can't find the container with id 95d2af89ebbe9034f960439b27501b09e2bac4eb8f1ce5a69820634941438262 Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.796143 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" event={"ID":"91415f40-08a2-451b-abe8-38c7b447e66f","Type":"ContainerStarted","Data":"7a894b2192ecbc4bf04b66f086babcfd753f613e3042b73a33afc3dff20e3446"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.797348 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" 
event={"ID":"ec624ec4-966f-410c-95c7-73be0f9cad27","Type":"ContainerStarted","Data":"8ed5f266616163b2dab0d917ea7973f91352898c5fc26400c3adc4f5c0fc8803"} Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.799089 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" event={"ID":"760c8dff-c64a-492b-a778-45ef16d197bd","Type":"ContainerStarted","Data":"54c69f7c05dc1ca1473350fb37daec194b8955b9bca06697380d6641a56bf5ba"} Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.800456 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" podUID="91415f40-08a2-451b-abe8-38c7b447e66f" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.800979 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" podUID="ec624ec4-966f-410c-95c7-73be0f9cad27" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.801405 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" event={"ID":"8b177c77-d85f-4374-b6db-a700719c1282","Type":"ContainerStarted","Data":"70a38461ea45c63c7783503e6dd9bde5a86b428afbbe53a9331303eb17d10568"} Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.802664 4883 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b429bd6_00de_4cc2_8a18_9f58897b6834.slice/crio-4eb7fb72943ec2e70cd70ab6beea51a9c9dc109f24845cd41cef0da72a4ae727 WatchSource:0}: Error finding container 4eb7fb72943ec2e70cd70ab6beea51a9c9dc109f24845cd41cef0da72a4ae727: Status 404 returned error can't find the container with id 4eb7fb72943ec2e70cd70ab6beea51a9c9dc109f24845cd41cef0da72a4ae727 Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.802786 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4\\\"\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" podUID="8b177c77-d85f-4374-b6db-a700719c1282" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.803229 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" event={"ID":"ac18771f-5f45-40d8-b275-38e2e1c48ba6","Type":"ContainerStarted","Data":"0d3e72f814efc2be6fc92c843167a3a5ca521b1a88c10692ca238e1474290b62"} Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.803315 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 
-3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28zzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-qnwgj_openstack-operators(c13f33e2-dd6a-4ca0-91e7-5489c753e273): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: W0310 09:20:09.803836 4883 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7216675_a296_4faa_9dd5_d857b15ffa3c.slice/crio-e562d4a75b9b0f77867ac5d95482e2f08321e1371a7015804516057212f015a4 WatchSource:0}: Error finding container e562d4a75b9b0f77867ac5d95482e2f08321e1371a7015804516057212f015a4: Status 404 returned error can't find the container with id e562d4a75b9b0f77867ac5d95482e2f08321e1371a7015804516057212f015a4 Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.804415 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" podUID="c13f33e2-dd6a-4ca0-91e7-5489c753e273" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.805383 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" event={"ID":"ad93994a-26d2-4353-80be-456c1311020e","Type":"ContainerStarted","Data":"5b38c901f6b7f89ef7b83e3b72b180ef7efe84c1ca1ac084fa760daeb3bbbb25"} Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.806309 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} 
{} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9qcgt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-rkjsw_openstack-operators(a7216675-a296-4faa-9dd5-d857b15ffa3c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.807142 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ztg6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-m6wph_openstack-operators(1b429bd6-00de-4cc2-8a18-9f58897b6834): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.808151 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" podUID="a7216675-a296-4faa-9dd5-d857b15ffa3c" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.808326 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" podUID="1b429bd6-00de-4cc2-8a18-9f58897b6834" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.815567 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw"] Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.818599 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ll9mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pjjsn_openstack-operators(475c1190-6d94-431a-943d-4e749ea87d6b): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 09:20:09 crc kubenswrapper[4883]: E0310 09:20:09.819864 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" podUID="475c1190-6d94-431a-943d-4e749ea87d6b" Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.821940 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m6wph"] Mar 10 09:20:09 crc kubenswrapper[4883]: I0310 09:20:09.829115 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn"] Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.004791 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.005017 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.005126 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:12.005103025 +0000 UTC m=+998.260000914 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.208371 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.208684 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.208822 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:12.208791216 +0000 UTC m=+998.463689104 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.614270 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.614634 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.614510 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.614776 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.614779 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:12.614757447 +0000 UTC m=+998.869655337 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.614851 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:12.614836346 +0000 UTC m=+998.869734235 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.816301 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" event={"ID":"1b429bd6-00de-4cc2-8a18-9f58897b6834","Type":"ContainerStarted","Data":"4eb7fb72943ec2e70cd70ab6beea51a9c9dc109f24845cd41cef0da72a4ae727"} Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.818129 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" podUID="1b429bd6-00de-4cc2-8a18-9f58897b6834" Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.818510 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" 
event={"ID":"3f4c2998-b51a-4620-b674-60bb0817eb7d","Type":"ContainerStarted","Data":"9e6c01249661be687c2ade9349da3c4b471b06c572b3480c83adf1edb0dbdb75"} Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.819765 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" podUID="3f4c2998-b51a-4620-b674-60bb0817eb7d" Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.821545 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" event={"ID":"d0e08342-2d1b-42d9-921e-1d948f701a58","Type":"ContainerStarted","Data":"83be3ce4f1134f8106165b66a365c7f6385a705befea29b15acd0ad9c321bea9"} Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.822729 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" podUID="d0e08342-2d1b-42d9-921e-1d948f701a58" Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.823095 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" event={"ID":"a7216675-a296-4faa-9dd5-d857b15ffa3c","Type":"ContainerStarted","Data":"e562d4a75b9b0f77867ac5d95482e2f08321e1371a7015804516057212f015a4"} Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.828189 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" 
event={"ID":"c13f33e2-dd6a-4ca0-91e7-5489c753e273","Type":"ContainerStarted","Data":"95d2af89ebbe9034f960439b27501b09e2bac4eb8f1ce5a69820634941438262"} Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.828769 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" podUID="a7216675-a296-4faa-9dd5-d857b15ffa3c" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.829835 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" podUID="c13f33e2-dd6a-4ca0-91e7-5489c753e273" Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.831942 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" event={"ID":"d3d3c04d-7e05-4df2-85c6-394d0bde1a69","Type":"ContainerStarted","Data":"20419ef0a719d8f320d143264e1238ecc015d808842afc2b967ed0ef75b655ec"} Mar 10 09:20:10 crc kubenswrapper[4883]: I0310 09:20:10.835210 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" event={"ID":"475c1190-6d94-431a-943d-4e749ea87d6b","Type":"ContainerStarted","Data":"ae1c00df4359638bd98f4acdf16b26bc4b854e6f9f48cdab9c86749e576e2478"} Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.838879 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" podUID="475c1190-6d94-431a-943d-4e749ea87d6b" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.838944 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" podUID="91415f40-08a2-451b-abe8-38c7b447e66f" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.838987 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" podUID="ec624ec4-966f-410c-95c7-73be0f9cad27" Mar 10 09:20:10 crc kubenswrapper[4883]: E0310 09:20:10.842107 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4\\\"\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" podUID="8b177c77-d85f-4374-b6db-a700719c1282" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.847737 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" podUID="475c1190-6d94-431a-943d-4e749ea87d6b" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.848265 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" podUID="a7216675-a296-4faa-9dd5-d857b15ffa3c" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.848313 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" podUID="1b429bd6-00de-4cc2-8a18-9f58897b6834" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.848385 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" podUID="3f4c2998-b51a-4620-b674-60bb0817eb7d" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.848431 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" podUID="d0e08342-2d1b-42d9-921e-1d948f701a58" Mar 10 09:20:11 crc kubenswrapper[4883]: E0310 09:20:11.848559 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" podUID="c13f33e2-dd6a-4ca0-91e7-5489c753e273" Mar 10 09:20:12 crc kubenswrapper[4883]: I0310 09:20:12.040926 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.041101 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.041192 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:16.041170432 +0000 UTC m=+1002.296068320 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: I0310 09:20:12.248610 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.248796 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.249098 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:16.249075193 +0000 UTC m=+1002.503973082 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: I0310 09:20:12.653912 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:12 crc kubenswrapper[4883]: I0310 09:20:12.653975 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.654268 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.654315 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.654367 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:16.654347627 +0000 UTC m=+1002.909245515 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:12 crc kubenswrapper[4883]: E0310 09:20:12.654412 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:16.65439203 +0000 UTC m=+1002.909289918 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: I0310 09:20:16.112609 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.112933 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.113548 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:24.113523186 +0000 UTC m=+1010.368421064 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: I0310 09:20:16.317104 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.317351 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.317440 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:24.317409269 +0000 UTC m=+1010.572307159 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: I0310 09:20:16.723729 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:16 crc kubenswrapper[4883]: I0310 09:20:16.723807 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.723921 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.724002 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:24.723980813 +0000 UTC m=+1010.978878702 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.724086 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:16 crc kubenswrapper[4883]: E0310 09:20:16.724171 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:24.724135013 +0000 UTC m=+1010.979032892 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.448582 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.448815 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.448858 4883 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.449262 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.449301 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e" gracePeriod=600 Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.913771 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e" exitCode=0 Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.913814 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e"} Mar 10 09:20:17 crc kubenswrapper[4883]: I0310 09:20:17.913845 4883 scope.go:117] "RemoveContainer" containerID="263ff00fc2241dc1e88d637d22f488b48ffd25d14a523d99a3849ab10808063c" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.922652 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" 
event={"ID":"d3d3c04d-7e05-4df2-85c6-394d0bde1a69","Type":"ContainerStarted","Data":"7e23f10ff2ef9bcc37e8bd3393d601e3c367e0ac7cff2e0a201fea76c5c58a18"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.923362 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.924794 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" event={"ID":"8a4cb5eb-0894-440e-8cfd-448651696a6f","Type":"ContainerStarted","Data":"e7a91213ba34c8c1a78394a0fc3403d4f60c5f586f936b099529943421d6cad5"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.924890 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.926928 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" event={"ID":"884f7bcb-08ef-49f3-912b-ca921e342615","Type":"ContainerStarted","Data":"1cc064a3ba89d1674b9dc0011507e630993bd607b597a4606a644ee2dcbdecc6"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.927084 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.928738 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" event={"ID":"bf027c79-6bdb-4cfb-8c31-d785b80e2231","Type":"ContainerStarted","Data":"dc98f8b9ed49b5a1d62726c263856a3bade2c5ba6e98e23146118deed6031412"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.928794 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.930419 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" event={"ID":"09a04267-a914-4c55-add8-735a053038d3","Type":"ContainerStarted","Data":"111e96d855f304b51a0130b8400e37afdcb6ad41be6828cc22d88f5462faa519"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.930514 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.933441 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.934835 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" event={"ID":"760c8dff-c64a-492b-a778-45ef16d197bd","Type":"ContainerStarted","Data":"a9b77e4ec829f7f1da64203e5c1487f1dc19a319bd5c353083d660f457a53249"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.934952 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.936567 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" event={"ID":"9a394c48-31ca-4e99-b210-45ae6f67faaa","Type":"ContainerStarted","Data":"768f33138eb2722dccdf0f5396f1112ba1711d3e4a647933f731a1bba199bb8d"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.936661 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.938211 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" event={"ID":"63474f68-d09d-4822-b650-96a37aead592","Type":"ContainerStarted","Data":"a2b30f0071070c883067425c34be93fc6509f16a62dac19c638343e99715fa77"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.938363 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.942372 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" podStartSLOduration=2.207306025 podStartE2EDuration="10.942355855s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.78124318 +0000 UTC m=+996.036141069" lastFinishedPulling="2026-03-10 09:20:18.516293009 +0000 UTC m=+1004.771190899" observedRunningTime="2026-03-10 09:20:18.93806871 +0000 UTC m=+1005.192966599" watchObservedRunningTime="2026-03-10 09:20:18.942355855 +0000 UTC m=+1005.197253744" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.945541 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" event={"ID":"ac18771f-5f45-40d8-b275-38e2e1c48ba6","Type":"ContainerStarted","Data":"7dda4daa1fb1649747b31147546d2ad5ec9a2d80673842518fe392ff7c43f7d2"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.945954 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.947390 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" event={"ID":"04b3aecb-7cfd-4042-b003-4bc8c339aff8","Type":"ContainerStarted","Data":"343a950e4152184cdf00aae8d1d145bbc817dce45e0c5e74877e6bc1749e49b9"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.947517 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.948813 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" event={"ID":"ad93994a-26d2-4353-80be-456c1311020e","Type":"ContainerStarted","Data":"99ad760dad0f846d00a375655e727e54d33e58c5cadcefcbaa5ae6d3bfebbec0"} Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.948971 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.961139 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" podStartSLOduration=1.9989325409999998 podStartE2EDuration="10.961121709s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.503218843 +0000 UTC m=+995.758116732" lastFinishedPulling="2026-03-10 09:20:18.465408011 +0000 UTC m=+1004.720305900" observedRunningTime="2026-03-10 09:20:18.959661596 +0000 UTC m=+1005.214559484" watchObservedRunningTime="2026-03-10 09:20:18.961121709 +0000 UTC m=+1005.216019588" Mar 10 09:20:18 crc kubenswrapper[4883]: I0310 09:20:18.982319 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" podStartSLOduration=1.8990351159999999 podStartE2EDuration="10.982299752s" podCreationTimestamp="2026-03-10 
09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.436925111 +0000 UTC m=+995.691823000" lastFinishedPulling="2026-03-10 09:20:18.520189747 +0000 UTC m=+1004.775087636" observedRunningTime="2026-03-10 09:20:18.981351343 +0000 UTC m=+1005.236249233" watchObservedRunningTime="2026-03-10 09:20:18.982299752 +0000 UTC m=+1005.237197641" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.030515 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" podStartSLOduration=2.112578469 podStartE2EDuration="11.030500218s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.602456003 +0000 UTC m=+995.857353891" lastFinishedPulling="2026-03-10 09:20:18.520377751 +0000 UTC m=+1004.775275640" observedRunningTime="2026-03-10 09:20:19.02878308 +0000 UTC m=+1005.283680959" watchObservedRunningTime="2026-03-10 09:20:19.030500218 +0000 UTC m=+1005.285398107" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.084508 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" podStartSLOduration=2.08306965 podStartE2EDuration="11.084487285s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.514965883 +0000 UTC m=+995.769863773" lastFinishedPulling="2026-03-10 09:20:18.516383519 +0000 UTC m=+1004.771281408" observedRunningTime="2026-03-10 09:20:19.060952928 +0000 UTC m=+1005.315850817" watchObservedRunningTime="2026-03-10 09:20:19.084487285 +0000 UTC m=+1005.339385173" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.099631 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" podStartSLOduration=2.181918445 podStartE2EDuration="11.099595993s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" 
firstStartedPulling="2026-03-10 09:20:09.603187251 +0000 UTC m=+995.858085141" lastFinishedPulling="2026-03-10 09:20:18.52086481 +0000 UTC m=+1004.775762689" observedRunningTime="2026-03-10 09:20:19.099210826 +0000 UTC m=+1005.354108716" watchObservedRunningTime="2026-03-10 09:20:19.099595993 +0000 UTC m=+1005.354493882" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.144942 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" podStartSLOduration=2.220559876 podStartE2EDuration="11.14492291s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.591362946 +0000 UTC m=+995.846260835" lastFinishedPulling="2026-03-10 09:20:18.51572598 +0000 UTC m=+1004.770623869" observedRunningTime="2026-03-10 09:20:19.142320423 +0000 UTC m=+1005.397218313" watchObservedRunningTime="2026-03-10 09:20:19.14492291 +0000 UTC m=+1005.399820799" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.224839 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" podStartSLOduration=2.146921735 podStartE2EDuration="11.224810825s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.435531032 +0000 UTC m=+995.690428921" lastFinishedPulling="2026-03-10 09:20:18.513420122 +0000 UTC m=+1004.768318011" observedRunningTime="2026-03-10 09:20:19.22113794 +0000 UTC m=+1005.476035829" watchObservedRunningTime="2026-03-10 09:20:19.224810825 +0000 UTC m=+1005.479708714" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.295967 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" podStartSLOduration=2.363514839 podStartE2EDuration="11.295946377s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" 
firstStartedPulling="2026-03-10 09:20:09.588448641 +0000 UTC m=+995.843346530" lastFinishedPulling="2026-03-10 09:20:18.520880178 +0000 UTC m=+1004.775778068" observedRunningTime="2026-03-10 09:20:19.295280742 +0000 UTC m=+1005.550178631" watchObservedRunningTime="2026-03-10 09:20:19.295946377 +0000 UTC m=+1005.550844267" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.359332 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" podStartSLOduration=2.462614961 podStartE2EDuration="11.359315264s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.585963925 +0000 UTC m=+995.840861815" lastFinishedPulling="2026-03-10 09:20:18.482664239 +0000 UTC m=+1004.737562118" observedRunningTime="2026-03-10 09:20:19.355023683 +0000 UTC m=+1005.609921571" watchObservedRunningTime="2026-03-10 09:20:19.359315264 +0000 UTC m=+1005.614213154" Mar 10 09:20:19 crc kubenswrapper[4883]: I0310 09:20:19.391585 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" podStartSLOduration=2.289595797 podStartE2EDuration="11.391568339s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.441873251 +0000 UTC m=+995.696771139" lastFinishedPulling="2026-03-10 09:20:18.543845791 +0000 UTC m=+1004.798743681" observedRunningTime="2026-03-10 09:20:19.38758028 +0000 UTC m=+1005.642478169" watchObservedRunningTime="2026-03-10 09:20:19.391568339 +0000 UTC m=+1005.646466229" Mar 10 09:20:24 crc kubenswrapper[4883]: I0310 09:20:24.135629 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " 
pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.135716 4883 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.136341 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert podName:c994e4ad-140c-4655-ad69-e4013406d12e nodeName:}" failed. No retries permitted until 2026-03-10 09:20:40.136318563 +0000 UTC m=+1026.391216452 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert") pod "infra-operator-controller-manager-5995f4446f-v6p2d" (UID: "c994e4ad-140c-4655-ad69-e4013406d12e") : secret "infra-operator-webhook-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: I0310 09:20:24.339576 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.339833 4883 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.339947 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert podName:2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f nodeName:}" failed. No retries permitted until 2026-03-10 09:20:40.339925881 +0000 UTC m=+1026.594823770 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert") pod "openstack-baremetal-operator-controller-manager-6647d7885f9f2px" (UID: "2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: I0310 09:20:24.747011 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:24 crc kubenswrapper[4883]: I0310 09:20:24.747306 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.747218 4883 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.747532 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:40.747517517 +0000 UTC m=+1027.002415406 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "metrics-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.747465 4883 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 09:20:24 crc kubenswrapper[4883]: E0310 09:20:24.747714 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs podName:969b2d39-fb99-42df-8e6e-3ded5cd292c8 nodeName:}" failed. No retries permitted until 2026-03-10 09:20:40.747689862 +0000 UTC m=+1027.002587750 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs") pod "openstack-operator-controller-manager-6679ddfdc7-9ntl4" (UID: "969b2d39-fb99-42df-8e6e-3ded5cd292c8") : secret "webhook-server-cert" not found Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.424972 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-q52nj" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.437827 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-nzdsk" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.439786 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h2cxw" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.591229 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-txdwh" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.608521 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v5kxw" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.771328 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-w9dbp" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.799822 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-mbxnn" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.813091 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fvwbt" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.817669 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-rpwdx" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.879288 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-pppd9" Mar 10 09:20:28 crc kubenswrapper[4883]: I0310 09:20:28.944433 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-8mpp4" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.029921 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" event={"ID":"8b177c77-d85f-4374-b6db-a700719c1282","Type":"ContainerStarted","Data":"2b22b77f345de2ec28206c76d238c648870af6a448d35083baa444304148a8de"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.030430 4883 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.033051 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" event={"ID":"475c1190-6d94-431a-943d-4e749ea87d6b","Type":"ContainerStarted","Data":"da557b399e4261a9114cb2eb0f95fabbaed65c94b06d9fced6cd9a82ebc3bf15"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.034919 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" event={"ID":"1b429bd6-00de-4cc2-8a18-9f58897b6834","Type":"ContainerStarted","Data":"0faabe56de5bb1f2b247e7d90820445910d2a35d205010b8244c27c233669ee3"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.035078 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.036259 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" event={"ID":"3f4c2998-b51a-4620-b674-60bb0817eb7d","Type":"ContainerStarted","Data":"39c59132c7a776ad64e1758acc76c10b7a6c76a512d094556ee28d5932c7ca7b"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.036468 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.037816 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" event={"ID":"91415f40-08a2-451b-abe8-38c7b447e66f","Type":"ContainerStarted","Data":"f401e6cdb7f542da8d4084cf2814b23a1ee7684a0ebd7b034112835f6dc2e47d"} Mar 10 09:20:30 crc kubenswrapper[4883]: 
I0310 09:20:30.037991 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.039443 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" event={"ID":"ec624ec4-966f-410c-95c7-73be0f9cad27","Type":"ContainerStarted","Data":"d3cd4289e8a33c51a2b626dff092b53667b2ae2fa77c8ee6e9a28239738665cf"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.039657 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.041193 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" event={"ID":"a7216675-a296-4faa-9dd5-d857b15ffa3c","Type":"ContainerStarted","Data":"582f2959b40ee58a20fdefb5b62254d9b58c00ad09eb1a6157268c8ab23b7988"} Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.041377 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.042749 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" podStartSLOduration=2.959986144 podStartE2EDuration="22.042730791s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.623028764 +0000 UTC m=+995.877926654" lastFinishedPulling="2026-03-10 09:20:28.705773412 +0000 UTC m=+1014.960671301" observedRunningTime="2026-03-10 09:20:30.040925968 +0000 UTC m=+1016.295823858" watchObservedRunningTime="2026-03-10 09:20:30.042730791 +0000 UTC m=+1016.297628680" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 
09:20:30.053768 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" podStartSLOduration=3.553471607 podStartE2EDuration="22.053752323s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.627308985 +0000 UTC m=+995.882206874" lastFinishedPulling="2026-03-10 09:20:28.127589701 +0000 UTC m=+1014.382487590" observedRunningTime="2026-03-10 09:20:30.051833064 +0000 UTC m=+1016.306730954" watchObservedRunningTime="2026-03-10 09:20:30.053752323 +0000 UTC m=+1016.308650212" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.064696 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" podStartSLOduration=2.590838335 podStartE2EDuration="22.064682402s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.789275826 +0000 UTC m=+996.044173715" lastFinishedPulling="2026-03-10 09:20:29.263119893 +0000 UTC m=+1015.518017782" observedRunningTime="2026-03-10 09:20:30.061115576 +0000 UTC m=+1016.316013466" watchObservedRunningTime="2026-03-10 09:20:30.064682402 +0000 UTC m=+1016.319580292" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.081235 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" podStartSLOduration=2.3919529219999998 podStartE2EDuration="22.081175021s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.611264071 +0000 UTC m=+995.866161960" lastFinishedPulling="2026-03-10 09:20:29.30048617 +0000 UTC m=+1015.555384059" observedRunningTime="2026-03-10 09:20:30.078051161 +0000 UTC m=+1016.332949049" watchObservedRunningTime="2026-03-10 09:20:30.081175021 +0000 UTC m=+1016.336072910" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 
09:20:30.097112 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjjsn" podStartSLOduration=2.614934079 podStartE2EDuration="22.09710094s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.818350128 +0000 UTC m=+996.073248017" lastFinishedPulling="2026-03-10 09:20:29.300516989 +0000 UTC m=+1015.555414878" observedRunningTime="2026-03-10 09:20:30.093509188 +0000 UTC m=+1016.348407076" watchObservedRunningTime="2026-03-10 09:20:30.09710094 +0000 UTC m=+1016.351998829" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.134621 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" podStartSLOduration=3.234709236 podStartE2EDuration="22.134601842s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.806020199 +0000 UTC m=+996.060918088" lastFinishedPulling="2026-03-10 09:20:28.705912805 +0000 UTC m=+1014.960810694" observedRunningTime="2026-03-10 09:20:30.133465539 +0000 UTC m=+1016.388363428" watchObservedRunningTime="2026-03-10 09:20:30.134601842 +0000 UTC m=+1016.389499731" Mar 10 09:20:30 crc kubenswrapper[4883]: I0310 09:20:30.135131 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" podStartSLOduration=2.6768084979999998 podStartE2EDuration="22.13512627s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.807041725 +0000 UTC m=+996.061939614" lastFinishedPulling="2026-03-10 09:20:29.265359496 +0000 UTC m=+1015.520257386" observedRunningTime="2026-03-10 09:20:30.117053072 +0000 UTC m=+1016.371950961" watchObservedRunningTime="2026-03-10 09:20:30.13512627 +0000 UTC m=+1016.390024160" Mar 10 09:20:34 crc kubenswrapper[4883]: I0310 09:20:34.589182 4883 
scope.go:117] "RemoveContainer" containerID="3e7da8f0c03e771b080917bc83392de1ddb5243f6ec147ddb91205eab0cfd88f" Mar 10 09:20:38 crc kubenswrapper[4883]: I0310 09:20:38.688215 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-kz9sv" Mar 10 09:20:38 crc kubenswrapper[4883]: I0310 09:20:38.690439 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-dgrlb" Mar 10 09:20:38 crc kubenswrapper[4883]: I0310 09:20:38.737907 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-snvh5" Mar 10 09:20:38 crc kubenswrapper[4883]: I0310 09:20:38.902332 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m6wph" Mar 10 09:20:38 crc kubenswrapper[4883]: I0310 09:20:38.923073 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mkjnt" Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.123538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" event={"ID":"d0e08342-2d1b-42d9-921e-1d948f701a58","Type":"ContainerStarted","Data":"a026ed0e71110320753d42e030185f6f7a0fdc8887cc65f0a34dde9e777bf6de"} Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.123747 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.125691 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" 
event={"ID":"c13f33e2-dd6a-4ca0-91e7-5489c753e273","Type":"ContainerStarted","Data":"a113883fa844ed32d6c3d17f9729f9be9ecd130d7b96f7c819adafd130b87ad9"} Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.125886 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.205118 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-rkjsw" Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.229150 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" podStartSLOduration=2.833061728 podStartE2EDuration="31.22913217s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.789338194 +0000 UTC m=+996.044236073" lastFinishedPulling="2026-03-10 09:20:38.185408626 +0000 UTC m=+1024.440306515" observedRunningTime="2026-03-10 09:20:39.138354661 +0000 UTC m=+1025.393252550" watchObservedRunningTime="2026-03-10 09:20:39.22913217 +0000 UTC m=+1025.484030058" Mar 10 09:20:39 crc kubenswrapper[4883]: I0310 09:20:39.230050 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" podStartSLOduration=2.85096265 podStartE2EDuration="31.230043437s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:09.803198368 +0000 UTC m=+996.058096257" lastFinishedPulling="2026-03-10 09:20:38.182279145 +0000 UTC m=+1024.437177044" observedRunningTime="2026-03-10 09:20:39.224844745 +0000 UTC m=+1025.479742635" watchObservedRunningTime="2026-03-10 09:20:39.230043437 +0000 UTC m=+1025.484941327" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.178755 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.185604 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c994e4ad-140c-4655-ad69-e4013406d12e-cert\") pod \"infra-operator-controller-manager-5995f4446f-v6p2d\" (UID: \"c994e4ad-140c-4655-ad69-e4013406d12e\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.362053 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4ld57" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.370805 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.380731 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.385209 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f-cert\") pod \"openstack-baremetal-operator-controller-manager-6647d7885f9f2px\" (UID: \"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.668319 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cggl5" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.676847 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.774451 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d"] Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.788635 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.788689 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.795520 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-metrics-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.795921 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/969b2d39-fb99-42df-8e6e-3ded5cd292c8-webhook-certs\") pod \"openstack-operator-controller-manager-6679ddfdc7-9ntl4\" (UID: \"969b2d39-fb99-42df-8e6e-3ded5cd292c8\") " 
pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:40 crc kubenswrapper[4883]: I0310 09:20:40.873805 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px"] Mar 10 09:20:40 crc kubenswrapper[4883]: W0310 09:20:40.878050 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a2580ec_7e99_4eb0_95e2_9e6ca33a6a5f.slice/crio-bb77f866bd704e415da5ad71f185c149c2da85a03c08e79218770975bdc6b666 WatchSource:0}: Error finding container bb77f866bd704e415da5ad71f185c149c2da85a03c08e79218770975bdc6b666: Status 404 returned error can't find the container with id bb77f866bd704e415da5ad71f185c149c2da85a03c08e79218770975bdc6b666 Mar 10 09:20:41 crc kubenswrapper[4883]: I0310 09:20:41.010275 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rpzgd" Mar 10 09:20:41 crc kubenswrapper[4883]: I0310 09:20:41.018615 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:41 crc kubenswrapper[4883]: I0310 09:20:41.145808 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" event={"ID":"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f","Type":"ContainerStarted","Data":"bb77f866bd704e415da5ad71f185c149c2da85a03c08e79218770975bdc6b666"} Mar 10 09:20:41 crc kubenswrapper[4883]: I0310 09:20:41.147805 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" event={"ID":"c994e4ad-140c-4655-ad69-e4013406d12e","Type":"ContainerStarted","Data":"e6b6002ce0e0bc69c96acf006c9fd4d004a12051d206f6b2293c00069f834e1f"} Mar 10 09:20:41 crc kubenswrapper[4883]: I0310 09:20:41.412098 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4"] Mar 10 09:20:41 crc kubenswrapper[4883]: W0310 09:20:41.416064 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod969b2d39_fb99_42df_8e6e_3ded5cd292c8.slice/crio-4ab786620574c94df8159751f4aebe381decf94fa2054c0bcc2e0632eeba008b WatchSource:0}: Error finding container 4ab786620574c94df8159751f4aebe381decf94fa2054c0bcc2e0632eeba008b: Status 404 returned error can't find the container with id 4ab786620574c94df8159751f4aebe381decf94fa2054c0bcc2e0632eeba008b Mar 10 09:20:42 crc kubenswrapper[4883]: I0310 09:20:42.162674 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" event={"ID":"969b2d39-fb99-42df-8e6e-3ded5cd292c8","Type":"ContainerStarted","Data":"159f95dc45d61cdf552b5808596cfacbb5b9148438d095c3d9091b58f2bea9b0"} Mar 10 09:20:42 crc kubenswrapper[4883]: I0310 09:20:42.162728 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" event={"ID":"969b2d39-fb99-42df-8e6e-3ded5cd292c8","Type":"ContainerStarted","Data":"4ab786620574c94df8159751f4aebe381decf94fa2054c0bcc2e0632eeba008b"} Mar 10 09:20:42 crc kubenswrapper[4883]: I0310 09:20:42.162752 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:20:42 crc kubenswrapper[4883]: I0310 09:20:42.195340 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" podStartSLOduration=34.195325652 podStartE2EDuration="34.195325652s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:20:42.184973522 +0000 UTC m=+1028.439871411" watchObservedRunningTime="2026-03-10 09:20:42.195325652 +0000 UTC m=+1028.450223542" Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.192532 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" event={"ID":"2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f","Type":"ContainerStarted","Data":"aabf5d387c6030614d20fd997892a9b068b90ddfeb567493b08ff36ea990cd9d"} Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.193097 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.194793 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" 
event={"ID":"c994e4ad-140c-4655-ad69-e4013406d12e","Type":"ContainerStarted","Data":"08ed32918672484b4be49c63cf650cd0b50a264e2e1f43db593f8b55c6995c80"} Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.194875 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.214087 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" podStartSLOduration=34.025012163 podStartE2EDuration="37.214076747s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:40.880380629 +0000 UTC m=+1027.135278519" lastFinishedPulling="2026-03-10 09:20:44.069445223 +0000 UTC m=+1030.324343103" observedRunningTime="2026-03-10 09:20:45.213573508 +0000 UTC m=+1031.468471396" watchObservedRunningTime="2026-03-10 09:20:45.214076747 +0000 UTC m=+1031.468974636" Mar 10 09:20:45 crc kubenswrapper[4883]: I0310 09:20:45.227491 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" podStartSLOduration=33.935478211 podStartE2EDuration="37.227448681s" podCreationTimestamp="2026-03-10 09:20:08 +0000 UTC" firstStartedPulling="2026-03-10 09:20:40.78197639 +0000 UTC m=+1027.036874280" lastFinishedPulling="2026-03-10 09:20:44.073946861 +0000 UTC m=+1030.328844750" observedRunningTime="2026-03-10 09:20:45.226527584 +0000 UTC m=+1031.481425473" watchObservedRunningTime="2026-03-10 09:20:45.227448681 +0000 UTC m=+1031.482346570" Mar 10 09:20:48 crc kubenswrapper[4883]: I0310 09:20:48.805036 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-49gjk" Mar 10 09:20:48 crc kubenswrapper[4883]: I0310 09:20:48.844575 4883 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-qnwgj" Mar 10 09:20:50 crc kubenswrapper[4883]: I0310 09:20:50.377175 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-v6p2d" Mar 10 09:20:50 crc kubenswrapper[4883]: I0310 09:20:50.685350 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6647d7885f9f2px" Mar 10 09:20:51 crc kubenswrapper[4883]: I0310 09:20:51.026283 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6679ddfdc7-9ntl4" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.446869 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"] Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.448609 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-56s6s" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.453987 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.454140 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.454233 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.454323 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-zrbkq" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.459490 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"] Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.495359 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"] Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.496701 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.498063 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.510430 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"] Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.540725 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljglh\" (UniqueName: \"kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.540802 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.540884 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96nm8\" (UniqueName: \"kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.541079 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " 
pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.541143 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.642562 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljglh\" (UniqueName: \"kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.642623 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.642706 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96nm8\" (UniqueName: \"kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s" Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.642804 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 
09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.643892 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.643814 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.643816 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.644728 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.661886 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljglh\" (UniqueName: \"kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh\") pod \"dnsmasq-dns-86bbd886cf-454p7\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " pod="openstack/dnsmasq-dns-86bbd886cf-454p7"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.661945 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96nm8\" (UniqueName: \"kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8\") pod \"dnsmasq-dns-589db6c89c-56s6s\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " pod="openstack/dnsmasq-dns-589db6c89c-56s6s"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.766109 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-56s6s"
Mar 10 09:21:05 crc kubenswrapper[4883]: I0310 09:21:05.809088 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-454p7"
Mar 10 09:21:06 crc kubenswrapper[4883]: I0310 09:21:06.153615 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"]
Mar 10 09:21:06 crc kubenswrapper[4883]: I0310 09:21:06.195672 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"]
Mar 10 09:21:06 crc kubenswrapper[4883]: I0310 09:21:06.358334 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-56s6s" event={"ID":"f3a815ae-f56c-4ad8-a4cd-b202012bf94a","Type":"ContainerStarted","Data":"83fd7b7c14ede20be946470ce0f3534de6b73ac0b442f2cc761c12340562dfa2"}
Mar 10 09:21:06 crc kubenswrapper[4883]: I0310 09:21:06.360141 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" event={"ID":"b1f1ef1a-4311-492d-b626-484f3b8ae836","Type":"ContainerStarted","Data":"b4c4edf66ac859048fbaea168e4fb72023b5242b17e67fe3301372d7bb2750e3"}
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.193205 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.217291 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.218428 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.235509 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.289388 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.289563 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.289707 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh9gd\" (UniqueName: \"kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.391231 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh9gd\" (UniqueName: \"kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.391315 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.391354 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.392184 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.392867 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.417267 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh9gd\" (UniqueName: \"kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd\") pod \"dnsmasq-dns-78cb4465c9-wcttq\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.449911 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.472656 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.491889 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.526487 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"]
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.553504 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.596069 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.596176 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.596219 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cksz8\" (UniqueName: \"kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.698316 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.698455 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.698520 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cksz8\" (UniqueName: \"kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.699831 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.700923 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.715342 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cksz8\" (UniqueName: \"kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8\") pod \"dnsmasq-dns-7c47bcb9f9-ckpnm\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:08 crc kubenswrapper[4883]: I0310 09:21:08.823099 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.095863 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"]
Mar 10 09:21:09 crc kubenswrapper[4883]: W0310 09:21:09.107068 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8010fd0c_6a0f_4078_851d_aff31b9efa90.slice/crio-27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d WatchSource:0}: Error finding container 27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d: Status 404 returned error can't find the container with id 27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.232987 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"]
Mar 10 09:21:09 crc kubenswrapper[4883]: W0310 09:21:09.239533 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b114327_1a63_488a_aace_0488259b1278.slice/crio-0f7fbcddfd5a9e8e2b01f02ad16e0ed221a09ead1d4f04beb704150072ccc53c WatchSource:0}: Error finding container 0f7fbcddfd5a9e8e2b01f02ad16e0ed221a09ead1d4f04beb704150072ccc53c: Status 404 returned error can't find the container with id 0f7fbcddfd5a9e8e2b01f02ad16e0ed221a09ead1d4f04beb704150072ccc53c
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.365444 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.366897 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.370525 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cjf6k"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.370622 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.370537 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.370739 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.370879 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.371650 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.371707 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.379495 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.397519 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" event={"ID":"8010fd0c-6a0f-4078-851d-aff31b9efa90","Type":"ContainerStarted","Data":"27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d"}
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.400661 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" event={"ID":"6b114327-1a63-488a-aace-0488259b1278","Type":"ContainerStarted","Data":"0f7fbcddfd5a9e8e2b01f02ad16e0ed221a09ead1d4f04beb704150072ccc53c"}
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.409755 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.409848 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.409882 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.409935 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.409960 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnprs\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410066 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410111 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410230 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410285 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410334 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.410425 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.511546 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.512704 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513333 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513275 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513616 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513755 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnprs\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.512645 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513813 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.513897 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.514034 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.514082 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.514122 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.514196 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.515098 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.515394 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.516147 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.520901 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.521111 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.521752 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.524437 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.527540 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnprs\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.535495 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.632643 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.634168 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.636920 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637213 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637219 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637357 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x4lhh"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637489 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637602 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.637698 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.644202 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.696929 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.820783 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821103 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821153 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821174 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821216 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821239 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh4vm\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821270 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821304 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821323 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821345 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.821379 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923026 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923085 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923135 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923163 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh4vm\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923211 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923235 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923257 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923275 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923294 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923417 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.923438 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.924203 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.925035 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.925453 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.926595 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0"
Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.926635 4883 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.927375 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.930006 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.933619 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.938935 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.939613 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " 
pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.942092 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh4vm\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.944621 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " pod="openstack/rabbitmq-server-0" Mar 10 09:21:09 crc kubenswrapper[4883]: I0310 09:21:09.972137 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 09:21:10 crc kubenswrapper[4883]: W0310 09:21:10.204141 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aa2bcd6_6a54_472f_bd1c_276e6f8caa07.slice/crio-010dc7b31b375a13abbbdc15bf6ef1b807f020341c6f04245119794152aa7af6 WatchSource:0}: Error finding container 010dc7b31b375a13abbbdc15bf6ef1b807f020341c6f04245119794152aa7af6: Status 404 returned error can't find the container with id 010dc7b31b375a13abbbdc15bf6ef1b807f020341c6f04245119794152aa7af6 Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.206688 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.389614 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 09:21:10 crc kubenswrapper[4883]: W0310 09:21:10.404058 4883 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdb6ba72_d1c8_4022_9029_2e18784e1139.slice/crio-531195218a3dcad57b9a11db1f5738cf463317c7f3a59428a1b0f404415ec848 WatchSource:0}: Error finding container 531195218a3dcad57b9a11db1f5738cf463317c7f3a59428a1b0f404415ec848: Status 404 returned error can't find the container with id 531195218a3dcad57b9a11db1f5738cf463317c7f3a59428a1b0f404415ec848 Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.438817 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerStarted","Data":"531195218a3dcad57b9a11db1f5738cf463317c7f3a59428a1b0f404415ec848"} Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.441065 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerStarted","Data":"010dc7b31b375a13abbbdc15bf6ef1b807f020341c6f04245119794152aa7af6"} Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.757421 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.758866 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.760240 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.762443 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.764262 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-hmvxb" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.769225 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.776762 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.784730 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.942839 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-config-data-default\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.942900 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5dae6834-0ed6-4043-9efe-91745925591a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.942975 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh5ms\" (UniqueName: \"kubernetes.io/projected/5dae6834-0ed6-4043-9efe-91745925591a-kube-api-access-gh5ms\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.943043 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-kolla-config\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.943089 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.943260 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.943349 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:10 crc kubenswrapper[4883]: I0310 09:21:10.943379 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.044756 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.044858 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.044887 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.044913 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-config-data-default\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.044946 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5dae6834-0ed6-4043-9efe-91745925591a-config-data-generated\") 
pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.045014 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh5ms\" (UniqueName: \"kubernetes.io/projected/5dae6834-0ed6-4043-9efe-91745925591a-kube-api-access-gh5ms\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.045040 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-kolla-config\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.045066 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.045347 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.046015 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5dae6834-0ed6-4043-9efe-91745925591a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc 
kubenswrapper[4883]: I0310 09:21:11.047705 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-config-data-default\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.047877 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-kolla-config\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.048570 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dae6834-0ed6-4043-9efe-91745925591a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.052469 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.061056 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh5ms\" (UniqueName: \"kubernetes.io/projected/5dae6834-0ed6-4043-9efe-91745925591a-kube-api-access-gh5ms\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.063113 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5dae6834-0ed6-4043-9efe-91745925591a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.065741 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"5dae6834-0ed6-4043-9efe-91745925591a\") " pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.079968 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 09:21:11 crc kubenswrapper[4883]: I0310 09:21:11.577683 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.246784 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.250318 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.252945 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.252966 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.252970 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-66v9q" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.258063 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.258429 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365245 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xng5m\" (UniqueName: \"kubernetes.io/projected/287f174d-514a-4c8c-a70e-b6e64fe41653-kube-api-access-xng5m\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365311 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365451 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365510 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365550 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365762 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365814 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.365862 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.468704 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.468906 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xng5m\" (UniqueName: \"kubernetes.io/projected/287f174d-514a-4c8c-a70e-b6e64fe41653-kube-api-access-xng5m\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.468943 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469023 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469046 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469078 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469142 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469201 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.469249 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.471780 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.472137 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.472744 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/287f174d-514a-4c8c-a70e-b6e64fe41653-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.472761 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/287f174d-514a-4c8c-a70e-b6e64fe41653-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.491054 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.491095 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287f174d-514a-4c8c-a70e-b6e64fe41653-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.499583 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.508212 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xng5m\" (UniqueName: \"kubernetes.io/projected/287f174d-514a-4c8c-a70e-b6e64fe41653-kube-api-access-xng5m\") pod \"openstack-cell1-galera-0\" (UID: \"287f174d-514a-4c8c-a70e-b6e64fe41653\") " pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.577714 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.654669 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.656768 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.664286 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-7vzv4" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.664707 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.664921 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.685412 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.774212 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-kolla-config\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.774373 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp89f\" (UniqueName: \"kubernetes.io/projected/52bdcacc-ce19-418b-871c-35482038da29-kube-api-access-cp89f\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.774551 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-config-data\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.774680 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.774794 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.877163 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-config-data\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.877265 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.877336 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.877393 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-kolla-config\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " 
pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.877426 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp89f\" (UniqueName: \"kubernetes.io/projected/52bdcacc-ce19-418b-871c-35482038da29-kube-api-access-cp89f\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.878898 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-config-data\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.880288 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52bdcacc-ce19-418b-871c-35482038da29-kolla-config\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.889764 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.889782 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bdcacc-ce19-418b-871c-35482038da29-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.902253 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp89f\" (UniqueName: 
\"kubernetes.io/projected/52bdcacc-ce19-418b-871c-35482038da29-kube-api-access-cp89f\") pod \"memcached-0\" (UID: \"52bdcacc-ce19-418b-871c-35482038da29\") " pod="openstack/memcached-0" Mar 10 09:21:12 crc kubenswrapper[4883]: I0310 09:21:12.984155 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 09:21:14 crc kubenswrapper[4883]: I0310 09:21:14.883175 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:21:14 crc kubenswrapper[4883]: I0310 09:21:14.884697 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 09:21:14 crc kubenswrapper[4883]: I0310 09:21:14.887248 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8zs8c" Mar 10 09:21:14 crc kubenswrapper[4883]: I0310 09:21:14.893826 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:21:15 crc kubenswrapper[4883]: I0310 09:21:15.037600 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k86gr\" (UniqueName: \"kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr\") pod \"kube-state-metrics-0\" (UID: \"5094e588-6ef7-4214-a96e-26d75ad98977\") " pod="openstack/kube-state-metrics-0" Mar 10 09:21:15 crc kubenswrapper[4883]: I0310 09:21:15.139930 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k86gr\" (UniqueName: \"kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr\") pod \"kube-state-metrics-0\" (UID: \"5094e588-6ef7-4214-a96e-26d75ad98977\") " pod="openstack/kube-state-metrics-0" Mar 10 09:21:15 crc kubenswrapper[4883]: I0310 09:21:15.156368 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k86gr\" (UniqueName: 
\"kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr\") pod \"kube-state-metrics-0\" (UID: \"5094e588-6ef7-4214-a96e-26d75ad98977\") " pod="openstack/kube-state-metrics-0" Mar 10 09:21:15 crc kubenswrapper[4883]: I0310 09:21:15.201089 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 09:21:15 crc kubenswrapper[4883]: I0310 09:21:15.549656 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5dae6834-0ed6-4043-9efe-91745925591a","Type":"ContainerStarted","Data":"95ee7666b24ae09d3b2e9bf6236b7e8d99bea51fde83eeeb876963a2df97ba11"} Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.453489 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.456949 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.461302 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.461360 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.461465 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.461869 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.462057 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-jnn7z" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.465148 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-nb-0"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.476113 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lb2z9"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.477350 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.479126 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.479829 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rp9zb" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.482656 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.495798 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-qrl4s"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.497641 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.504459 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lb2z9"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.516709 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qrl4s"] Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601609 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601692 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28145780-82a1-453f-be56-b22c635f027e-scripts\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601728 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg6r6\" (UniqueName: \"kubernetes.io/projected/6691939e-adb0-420c-bf9e-f4a9b670c83b-kube-api-access-bg6r6\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601804 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 
09:21:18.601822 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-etc-ovs\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601847 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-combined-ca-bundle\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601869 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-run\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601896 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-ovn-controller-tls-certs\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.601980 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602018 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602048 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602070 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-log-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602090 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602113 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-config\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602141 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99k66\" 
(UniqueName: \"kubernetes.io/projected/28145780-82a1-453f-be56-b22c635f027e-kube-api-access-99k66\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602165 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602186 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-lib\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602226 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j27tl\" (UniqueName: \"kubernetes.io/projected/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-kube-api-access-j27tl\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602248 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6691939e-adb0-420c-bf9e-f4a9b670c83b-scripts\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602268 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-log\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.602287 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.705361 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j27tl\" (UniqueName: \"kubernetes.io/projected/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-kube-api-access-j27tl\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.705410 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6691939e-adb0-420c-bf9e-f4a9b670c83b-scripts\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.705442 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-log\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.705461 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run\") pod \"ovn-controller-lb2z9\" (UID: 
\"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.705990 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706098 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706140 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28145780-82a1-453f-be56-b22c635f027e-scripts\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706170 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg6r6\" (UniqueName: \"kubernetes.io/projected/6691939e-adb0-420c-bf9e-f4a9b670c83b-kube-api-access-bg6r6\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706235 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706256 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-etc-ovs\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706283 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-combined-ca-bundle\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706304 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-run\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706326 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-ovn-controller-tls-certs\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706389 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706413 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706450 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706466 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-log-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706503 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706523 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-config\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706551 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99k66\" (UniqueName: \"kubernetes.io/projected/28145780-82a1-453f-be56-b22c635f027e-kube-api-access-99k66\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " 
pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706573 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.706972 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.707571 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-lib\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.707748 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.709615 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-run\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.709833 4883 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28145780-82a1-453f-be56-b22c635f027e-scripts\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.710031 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-log\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.710105 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6691939e-adb0-420c-bf9e-f4a9b670c83b-scripts\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.710262 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-etc-ovs\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.711139 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.712399 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/28145780-82a1-453f-be56-b22c635f027e-var-lib\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " 
pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.713372 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-log-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.715264 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.718837 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-config\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.721809 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6691939e-adb0-420c-bf9e-f4a9b670c83b-var-run-ovn\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.728610 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.728673 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.729632 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-combined-ca-bundle\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.732328 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6691939e-adb0-420c-bf9e-f4a9b670c83b-ovn-controller-tls-certs\") pod \"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.735185 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j27tl\" (UniqueName: \"kubernetes.io/projected/be383ddb-b33d-4129-acf8-1ffbbc21b1d4-kube-api-access-j27tl\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.736295 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99k66\" (UniqueName: \"kubernetes.io/projected/28145780-82a1-453f-be56-b22c635f027e-kube-api-access-99k66\") pod \"ovn-controller-ovs-qrl4s\" (UID: \"28145780-82a1-453f-be56-b22c635f027e\") " pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.742707 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg6r6\" (UniqueName: \"kubernetes.io/projected/6691939e-adb0-420c-bf9e-f4a9b670c83b-kube-api-access-bg6r6\") pod 
\"ovn-controller-lb2z9\" (UID: \"6691939e-adb0-420c-bf9e-f4a9b670c83b\") " pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.759987 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"be383ddb-b33d-4129-acf8-1ffbbc21b1d4\") " pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.810265 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.817602 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:18 crc kubenswrapper[4883]: I0310 09:21:18.825808 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.224377 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.230521 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.234147 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.234728 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.234764 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.234873 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2swnw" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.256724 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.389866 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.389913 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.389947 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" 
(UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.389990 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.390009 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.390028 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q4tk\" (UniqueName: \"kubernetes.io/projected/249a9bf5-ef0f-4209-855e-3fa422106519-kube-api-access-9q4tk\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.390057 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.390087 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-config\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc 
kubenswrapper[4883]: I0310 09:21:22.493267 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493352 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493416 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493564 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493599 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493626 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q4tk\" (UniqueName: 
\"kubernetes.io/projected/249a9bf5-ef0f-4209-855e-3fa422106519-kube-api-access-9q4tk\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493686 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.493777 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-config\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.495188 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.495351 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.495520 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249a9bf5-ef0f-4209-855e-3fa422106519-config\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 
09:21:22.495665 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.502198 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.503076 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.511290 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q4tk\" (UniqueName: \"kubernetes.io/projected/249a9bf5-ef0f-4209-855e-3fa422106519-kube-api-access-9q4tk\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.511952 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/249a9bf5-ef0f-4209-855e-3fa422106519-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.516228 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249a9bf5-ef0f-4209-855e-3fa422106519\") " pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: I0310 09:21:22.566830 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:22 crc kubenswrapper[4883]: E0310 09:21:22.731959 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 10 09:21:22 crc kubenswrapper[4883]: E0310 09:21:22.732260 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ljglh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-454p7_openstack(b1f1ef1a-4311-492d-b626-484f3b8ae836): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:21:22 crc kubenswrapper[4883]: E0310 09:21:22.733490 4883 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" podUID="b1f1ef1a-4311-492d-b626-484f3b8ae836" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.848007 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.864330 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config\") pod \"b1f1ef1a-4311-492d-b626-484f3b8ae836\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.864634 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljglh\" (UniqueName: \"kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh\") pod \"b1f1ef1a-4311-492d-b626-484f3b8ae836\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.864783 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc\") pod \"b1f1ef1a-4311-492d-b626-484f3b8ae836\" (UID: \"b1f1ef1a-4311-492d-b626-484f3b8ae836\") " Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.866562 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config" (OuterVolumeSpecName: "config") pod "b1f1ef1a-4311-492d-b626-484f3b8ae836" (UID: "b1f1ef1a-4311-492d-b626-484f3b8ae836"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.866856 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1f1ef1a-4311-492d-b626-484f3b8ae836" (UID: "b1f1ef1a-4311-492d-b626-484f3b8ae836"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.874230 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh" (OuterVolumeSpecName: "kube-api-access-ljglh") pod "b1f1ef1a-4311-492d-b626-484f3b8ae836" (UID: "b1f1ef1a-4311-492d-b626-484f3b8ae836"). InnerVolumeSpecName "kube-api-access-ljglh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.970129 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.970165 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1f1ef1a-4311-492d-b626-484f3b8ae836-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:25 crc kubenswrapper[4883]: I0310 09:21:25.970180 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljglh\" (UniqueName: \"kubernetes.io/projected/b1f1ef1a-4311-492d-b626-484f3b8ae836-kube-api-access-ljglh\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.237040 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.346663 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-lb2z9"] Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.360601 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.367058 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.454355 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 09:21:26 crc kubenswrapper[4883]: W0310 09:21:26.470410 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5094e588_6ef7_4214_a96e_26d75ad98977.slice/crio-8cea65d53e3ef54f53f5028aee0f66a936c83a86f17bc4112a8c38018507f5cd WatchSource:0}: Error finding container 8cea65d53e3ef54f53f5028aee0f66a936c83a86f17bc4112a8c38018507f5cd: Status 404 returned error can't find the container with id 8cea65d53e3ef54f53f5028aee0f66a936c83a86f17bc4112a8c38018507f5cd Mar 10 09:21:26 crc kubenswrapper[4883]: W0310 09:21:26.472507 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod287f174d_514a_4c8c_a70e_b6e64fe41653.slice/crio-cf77515b6a8e23e44ad6eba001ccc79a5abfefcb2792c111c292972a2cf06ea7 WatchSource:0}: Error finding container cf77515b6a8e23e44ad6eba001ccc79a5abfefcb2792c111c292972a2cf06ea7: Status 404 returned error can't find the container with id cf77515b6a8e23e44ad6eba001ccc79a5abfefcb2792c111c292972a2cf06ea7 Mar 10 09:21:26 crc kubenswrapper[4883]: W0310 09:21:26.474441 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe383ddb_b33d_4129_acf8_1ffbbc21b1d4.slice/crio-21b6096c13b9eb852df5fe497b5e2eb1956343d65d21a4daf84a5660fbacbb10 WatchSource:0}: Error finding container 
21b6096c13b9eb852df5fe497b5e2eb1956343d65d21a4daf84a5660fbacbb10: Status 404 returned error can't find the container with id 21b6096c13b9eb852df5fe497b5e2eb1956343d65d21a4daf84a5660fbacbb10 Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.536240 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 09:21:26 crc kubenswrapper[4883]: W0310 09:21:26.540088 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod249a9bf5_ef0f_4209_855e_3fa422106519.slice/crio-62849294d0207d516f219fbf4ddbd471c56ac1787264d9d45691377b27b19375 WatchSource:0}: Error finding container 62849294d0207d516f219fbf4ddbd471c56ac1787264d9d45691377b27b19375: Status 404 returned error can't find the container with id 62849294d0207d516f219fbf4ddbd471c56ac1787264d9d45691377b27b19375 Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.654689 4883 generic.go:334] "Generic (PLEG): container finished" podID="6b114327-1a63-488a-aace-0488259b1278" containerID="0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a" exitCode=0 Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.654803 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" event={"ID":"6b114327-1a63-488a-aace-0488259b1278","Type":"ContainerDied","Data":"0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.660740 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"be383ddb-b33d-4129-acf8-1ffbbc21b1d4","Type":"ContainerStarted","Data":"21b6096c13b9eb852df5fe497b5e2eb1956343d65d21a4daf84a5660fbacbb10"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.665256 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"5dae6834-0ed6-4043-9efe-91745925591a","Type":"ContainerStarted","Data":"a3b53efd4d291b31d42168aec7ece633e3c99bad85e93af9f7b973b657782982"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.671183 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"249a9bf5-ef0f-4209-855e-3fa422106519","Type":"ContainerStarted","Data":"62849294d0207d516f219fbf4ddbd471c56ac1787264d9d45691377b27b19375"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.678614 4883 generic.go:334] "Generic (PLEG): container finished" podID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerID="a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd" exitCode=0 Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.678867 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" event={"ID":"8010fd0c-6a0f-4078-851d-aff31b9efa90","Type":"ContainerDied","Data":"a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.682486 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"287f174d-514a-4c8c-a70e-b6e64fe41653","Type":"ContainerStarted","Data":"cf77515b6a8e23e44ad6eba001ccc79a5abfefcb2792c111c292972a2cf06ea7"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.684108 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lb2z9" event={"ID":"6691939e-adb0-420c-bf9e-f4a9b670c83b","Type":"ContainerStarted","Data":"4840599676e1d0ad84cee1bd1bb0b41f3c4212049cf840499f26fa38bb5474a4"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.685515 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"52bdcacc-ce19-418b-871c-35482038da29","Type":"ContainerStarted","Data":"3e7f41d3bfe364c680b02181ceb7d66bd5258c6231ec38db4d490c2f4cfc10fc"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 
09:21:26.689289 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5094e588-6ef7-4214-a96e-26d75ad98977","Type":"ContainerStarted","Data":"8cea65d53e3ef54f53f5028aee0f66a936c83a86f17bc4112a8c38018507f5cd"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.700941 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.700925 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-454p7" event={"ID":"b1f1ef1a-4311-492d-b626-484f3b8ae836","Type":"ContainerDied","Data":"b4c4edf66ac859048fbaea168e4fb72023b5242b17e67fe3301372d7bb2750e3"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.702713 4883 generic.go:334] "Generic (PLEG): container finished" podID="f3a815ae-f56c-4ad8-a4cd-b202012bf94a" containerID="e00ff4ed38f4346987a643fce8da97d41c7e1cf34fa07996eca6ab319b7d076d" exitCode=0 Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.702824 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-56s6s" event={"ID":"f3a815ae-f56c-4ad8-a4cd-b202012bf94a","Type":"ContainerDied","Data":"e00ff4ed38f4346987a643fce8da97d41c7e1cf34fa07996eca6ab319b7d076d"} Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.760516 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"] Mar 10 09:21:26 crc kubenswrapper[4883]: I0310 09:21:26.762348 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-454p7"] Mar 10 09:21:26 crc kubenswrapper[4883]: E0310 09:21:26.942339 4883 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 10 09:21:26 crc kubenswrapper[4883]: rpc error: code = Unknown desc = container create failed: mount 
`/var/lib/kubelet/pods/8010fd0c-6a0f-4078-851d-aff31b9efa90/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 10 09:21:26 crc kubenswrapper[4883]: > podSandboxID="27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d" Mar 10 09:21:26 crc kubenswrapper[4883]: E0310 09:21:26.942617 4883 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 09:21:26 crc kubenswrapper[4883]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zh9gd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Por
t:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78cb4465c9-wcttq_openstack(8010fd0c-6a0f-4078-851d-aff31b9efa90): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/8010fd0c-6a0f-4078-851d-aff31b9efa90/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 10 09:21:26 crc kubenswrapper[4883]: > logger="UnhandledError" Mar 10 09:21:26 crc kubenswrapper[4883]: E0310 09:21:26.943928 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/8010fd0c-6a0f-4078-851d-aff31b9efa90/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.105234 4883 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-56s6s" Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.258803 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qrl4s"] Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.292930 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96nm8\" (UniqueName: \"kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8\") pod \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.292984 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config\") pod \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\" (UID: \"f3a815ae-f56c-4ad8-a4cd-b202012bf94a\") " Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.300426 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8" (OuterVolumeSpecName: "kube-api-access-96nm8") pod "f3a815ae-f56c-4ad8-a4cd-b202012bf94a" (UID: "f3a815ae-f56c-4ad8-a4cd-b202012bf94a"). InnerVolumeSpecName "kube-api-access-96nm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.318165 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config" (OuterVolumeSpecName: "config") pod "f3a815ae-f56c-4ad8-a4cd-b202012bf94a" (UID: "f3a815ae-f56c-4ad8-a4cd-b202012bf94a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.396052 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.396099 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96nm8\" (UniqueName: \"kubernetes.io/projected/f3a815ae-f56c-4ad8-a4cd-b202012bf94a-kube-api-access-96nm8\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.715345 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-56s6s" Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.715510 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-56s6s" event={"ID":"f3a815ae-f56c-4ad8-a4cd-b202012bf94a","Type":"ContainerDied","Data":"83fd7b7c14ede20be946470ce0f3534de6b73ac0b442f2cc761c12340562dfa2"} Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.715829 4883 scope.go:117] "RemoveContainer" containerID="e00ff4ed38f4346987a643fce8da97d41c7e1cf34fa07996eca6ab319b7d076d" Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.718439 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerStarted","Data":"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5"} Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.721004 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"287f174d-514a-4c8c-a70e-b6e64fe41653","Type":"ContainerStarted","Data":"e9d6f667a090f843d59fd481c9c48cbf57a1b7366eea84e3bc032122f390cd65"} Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.725724 4883 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" event={"ID":"6b114327-1a63-488a-aace-0488259b1278","Type":"ContainerStarted","Data":"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc"} Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.725830 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.727675 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerStarted","Data":"cd676384987aa2244a851651d9f3c8c0df5750259fc99e409d5e9d090a18495f"} Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.729973 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qrl4s" event={"ID":"28145780-82a1-453f-be56-b22c635f027e","Type":"ContainerStarted","Data":"85b1226f3c138a389d93578b856a79c83ff2666be61efa138303092bb74abdff"} Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.785533 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" podStartSLOduration=3.123866549 podStartE2EDuration="19.785514705s" podCreationTimestamp="2026-03-10 09:21:08 +0000 UTC" firstStartedPulling="2026-03-10 09:21:09.241689685 +0000 UTC m=+1055.496587564" lastFinishedPulling="2026-03-10 09:21:25.90333783 +0000 UTC m=+1072.158235720" observedRunningTime="2026-03-10 09:21:27.779778799 +0000 UTC m=+1074.034676689" watchObservedRunningTime="2026-03-10 09:21:27.785514705 +0000 UTC m=+1074.040412594" Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.840061 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"] Mar 10 09:21:27 crc kubenswrapper[4883]: I0310 09:21:27.845717 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-56s6s"] Mar 10 09:21:28 crc kubenswrapper[4883]: 
I0310 09:21:28.096385 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f1ef1a-4311-492d-b626-484f3b8ae836" path="/var/lib/kubelet/pods/b1f1ef1a-4311-492d-b626-484f3b8ae836/volumes" Mar 10 09:21:28 crc kubenswrapper[4883]: I0310 09:21:28.096871 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a815ae-f56c-4ad8-a4cd-b202012bf94a" path="/var/lib/kubelet/pods/f3a815ae-f56c-4ad8-a4cd-b202012bf94a/volumes" Mar 10 09:21:29 crc kubenswrapper[4883]: I0310 09:21:29.750452 4883 generic.go:334] "Generic (PLEG): container finished" podID="5dae6834-0ed6-4043-9efe-91745925591a" containerID="a3b53efd4d291b31d42168aec7ece633e3c99bad85e93af9f7b973b657782982" exitCode=0 Mar 10 09:21:29 crc kubenswrapper[4883]: I0310 09:21:29.750525 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5dae6834-0ed6-4043-9efe-91745925591a","Type":"ContainerDied","Data":"a3b53efd4d291b31d42168aec7ece633e3c99bad85e93af9f7b973b657782982"} Mar 10 09:21:30 crc kubenswrapper[4883]: I0310 09:21:30.760043 4883 generic.go:334] "Generic (PLEG): container finished" podID="287f174d-514a-4c8c-a70e-b6e64fe41653" containerID="e9d6f667a090f843d59fd481c9c48cbf57a1b7366eea84e3bc032122f390cd65" exitCode=0 Mar 10 09:21:30 crc kubenswrapper[4883]: I0310 09:21:30.760104 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"287f174d-514a-4c8c-a70e-b6e64fe41653","Type":"ContainerDied","Data":"e9d6f667a090f843d59fd481c9c48cbf57a1b7366eea84e3bc032122f390cd65"} Mar 10 09:21:33 crc kubenswrapper[4883]: I0310 09:21:33.825573 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" Mar 10 09:21:33 crc kubenswrapper[4883]: I0310 09:21:33.869971 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"] Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.799911 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"287f174d-514a-4c8c-a70e-b6e64fe41653","Type":"ContainerStarted","Data":"757d30cd9b3c92fad097c14d32721d80969bb9767347c47720ed6d97b22675e4"} Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.804099 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"249a9bf5-ef0f-4209-855e-3fa422106519","Type":"ContainerStarted","Data":"8d9ad6373ae411b2ac0f01369623f779183f9c96c307b2b12b9353ccddafc6fe"} Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.806836 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lb2z9" event={"ID":"6691939e-adb0-420c-bf9e-f4a9b670c83b","Type":"ContainerStarted","Data":"0c7f3445ef1622d3d2fec3d8e8194d8e369d0132dbeadfb718f36b61be9f6b4e"} Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.807318 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lb2z9" Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.809179 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"be383ddb-b33d-4129-acf8-1ffbbc21b1d4","Type":"ContainerStarted","Data":"54303e3827b4f36fd621f8f6ec606578a5f34bd241b06b9411ec5d61c45035a7"} Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.811910 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qrl4s" event={"ID":"28145780-82a1-453f-be56-b22c635f027e","Type":"ContainerStarted","Data":"ff7519323bfb04da3dfae0be6b11fba29616584669ab746b419f63a7bc1b5efc"} Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.814249 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"5dae6834-0ed6-4043-9efe-91745925591a","Type":"ContainerStarted","Data":"d58644c8d73a34a611347de240574a01b69baf93c0a1009a92d6ebb3b29ef3f4"} Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 
09:21:34.817737 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5094e588-6ef7-4214-a96e-26d75ad98977","Type":"ContainerStarted","Data":"c0101a4659f1590e10ca756263b45be2c4afc16ddc5690114bf2e3350ef9322b"} Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.817889 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.823380 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" event={"ID":"8010fd0c-6a0f-4078-851d-aff31b9efa90","Type":"ContainerStarted","Data":"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85"} Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.823784 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerName="dnsmasq-dns" containerID="cri-o://768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85" gracePeriod=10 Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.824507 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.825506 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=23.825463676 podStartE2EDuration="23.825463676s" podCreationTimestamp="2026-03-10 09:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:21:34.822688384 +0000 UTC m=+1081.077586272" watchObservedRunningTime="2026-03-10 09:21:34.825463676 +0000 UTC m=+1081.080361566" Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.831169 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"52bdcacc-ce19-418b-871c-35482038da29","Type":"ContainerStarted","Data":"81e78ce75df4af1031f82d7356f973031ac1435a97eb2cebb5c3769550d451b6"} Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.832202 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.844198 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lb2z9" podStartSLOduration=9.067883535 podStartE2EDuration="16.844176481s" podCreationTimestamp="2026-03-10 09:21:18 +0000 UTC" firstStartedPulling="2026-03-10 09:21:26.36142475 +0000 UTC m=+1072.616322640" lastFinishedPulling="2026-03-10 09:21:34.137717697 +0000 UTC m=+1080.392615586" observedRunningTime="2026-03-10 09:21:34.836045339 +0000 UTC m=+1081.090943229" watchObservedRunningTime="2026-03-10 09:21:34.844176481 +0000 UTC m=+1081.099074370" Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.849663 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.204528349 podStartE2EDuration="20.849646455s" podCreationTimestamp="2026-03-10 09:21:14 +0000 UTC" firstStartedPulling="2026-03-10 09:21:26.473058445 +0000 UTC m=+1072.727956334" lastFinishedPulling="2026-03-10 09:21:34.118176551 +0000 UTC m=+1080.373074440" observedRunningTime="2026-03-10 09:21:34.847865227 +0000 UTC m=+1081.102763116" watchObservedRunningTime="2026-03-10 09:21:34.849646455 +0000 UTC m=+1081.104544344" Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.871919 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=15.177894914 podStartE2EDuration="25.87190174s" podCreationTimestamp="2026-03-10 09:21:09 +0000 UTC" firstStartedPulling="2026-03-10 09:21:15.124622922 +0000 UTC m=+1061.379520810" lastFinishedPulling="2026-03-10 09:21:25.818629747 +0000 UTC 
m=+1072.073527636" observedRunningTime="2026-03-10 09:21:34.865698653 +0000 UTC m=+1081.120596542" watchObservedRunningTime="2026-03-10 09:21:34.87190174 +0000 UTC m=+1081.126799628" Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.895092 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" podStartSLOduration=10.209000871 podStartE2EDuration="26.895070696s" podCreationTimestamp="2026-03-10 09:21:08 +0000 UTC" firstStartedPulling="2026-03-10 09:21:09.110310048 +0000 UTC m=+1055.365207937" lastFinishedPulling="2026-03-10 09:21:25.796379873 +0000 UTC m=+1072.051277762" observedRunningTime="2026-03-10 09:21:34.891681055 +0000 UTC m=+1081.146578944" watchObservedRunningTime="2026-03-10 09:21:34.895070696 +0000 UTC m=+1081.149968585" Mar 10 09:21:34 crc kubenswrapper[4883]: I0310 09:21:34.910216 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.083856854 podStartE2EDuration="22.910200876s" podCreationTimestamp="2026-03-10 09:21:12 +0000 UTC" firstStartedPulling="2026-03-10 09:21:26.290952059 +0000 UTC m=+1072.545849948" lastFinishedPulling="2026-03-10 09:21:34.117296081 +0000 UTC m=+1080.372193970" observedRunningTime="2026-03-10 09:21:34.907428708 +0000 UTC m=+1081.162326597" watchObservedRunningTime="2026-03-10 09:21:34.910200876 +0000 UTC m=+1081.165098764" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.208533 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.348223 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc\") pod \"8010fd0c-6a0f-4078-851d-aff31b9efa90\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.348321 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh9gd\" (UniqueName: \"kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd\") pod \"8010fd0c-6a0f-4078-851d-aff31b9efa90\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.348374 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config\") pod \"8010fd0c-6a0f-4078-851d-aff31b9efa90\" (UID: \"8010fd0c-6a0f-4078-851d-aff31b9efa90\") " Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.355098 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd" (OuterVolumeSpecName: "kube-api-access-zh9gd") pod "8010fd0c-6a0f-4078-851d-aff31b9efa90" (UID: "8010fd0c-6a0f-4078-851d-aff31b9efa90"). InnerVolumeSpecName "kube-api-access-zh9gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.392181 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8010fd0c-6a0f-4078-851d-aff31b9efa90" (UID: "8010fd0c-6a0f-4078-851d-aff31b9efa90"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.403714 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config" (OuterVolumeSpecName: "config") pod "8010fd0c-6a0f-4078-851d-aff31b9efa90" (UID: "8010fd0c-6a0f-4078-851d-aff31b9efa90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.450679 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh9gd\" (UniqueName: \"kubernetes.io/projected/8010fd0c-6a0f-4078-851d-aff31b9efa90-kube-api-access-zh9gd\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.450714 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.450725 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8010fd0c-6a0f-4078-851d-aff31b9efa90-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847105 4883 generic.go:334] "Generic (PLEG): container finished" podID="28145780-82a1-453f-be56-b22c635f027e" containerID="ff7519323bfb04da3dfae0be6b11fba29616584669ab746b419f63a7bc1b5efc" exitCode=0 Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847219 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qrl4s" event={"ID":"28145780-82a1-453f-be56-b22c635f027e","Type":"ContainerDied","Data":"ff7519323bfb04da3dfae0be6b11fba29616584669ab746b419f63a7bc1b5efc"} Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847444 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qrl4s" 
event={"ID":"28145780-82a1-453f-be56-b22c635f027e","Type":"ContainerStarted","Data":"eb6a1dadbf21b091ec9e4aec4a2d239facf3a09a6ef6e5a7052eb11f03a359b8"} Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847458 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qrl4s" event={"ID":"28145780-82a1-453f-be56-b22c635f027e","Type":"ContainerStarted","Data":"6a52ddd2e9b0d4babaaa81937e6d0f741f9dc8aaa1166943b17dd4fe2b997126"} Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847489 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.847517 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qrl4s" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.856342 4883 generic.go:334] "Generic (PLEG): container finished" podID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerID="768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85" exitCode=0 Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.856390 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.856437 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" event={"ID":"8010fd0c-6a0f-4078-851d-aff31b9efa90","Type":"ContainerDied","Data":"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85"} Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.856486 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wcttq" event={"ID":"8010fd0c-6a0f-4078-851d-aff31b9efa90","Type":"ContainerDied","Data":"27f91884d19752af35b1407d9296c7623b17d38cd07f5ebd1cd011bdaef8ae8d"} Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.856516 4883 scope.go:117] "RemoveContainer" containerID="768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.871576 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-qrl4s" podStartSLOduration=11.161336447 podStartE2EDuration="17.871563571s" podCreationTimestamp="2026-03-10 09:21:18 +0000 UTC" firstStartedPulling="2026-03-10 09:21:27.426238882 +0000 UTC m=+1073.681136772" lastFinishedPulling="2026-03-10 09:21:34.136466006 +0000 UTC m=+1080.391363896" observedRunningTime="2026-03-10 09:21:35.865749258 +0000 UTC m=+1082.120647147" watchObservedRunningTime="2026-03-10 09:21:35.871563571 +0000 UTC m=+1082.126461460" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.890201 4883 scope.go:117] "RemoveContainer" containerID="a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.893612 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"] Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.901788 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-78cb4465c9-wcttq"] Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.916293 4883 scope.go:117] "RemoveContainer" containerID="768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85" Mar 10 09:21:35 crc kubenswrapper[4883]: E0310 09:21:35.916704 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85\": container with ID starting with 768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85 not found: ID does not exist" containerID="768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.916728 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85"} err="failed to get container status \"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85\": rpc error: code = NotFound desc = could not find container \"768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85\": container with ID starting with 768436eac43ce0b8300397ce9f4a0b55f609d060ff8df7ecf9790cd4967f7a85 not found: ID does not exist" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.916747 4883 scope.go:117] "RemoveContainer" containerID="a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd" Mar 10 09:21:35 crc kubenswrapper[4883]: E0310 09:21:35.917074 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd\": container with ID starting with a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd not found: ID does not exist" containerID="a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd" Mar 10 09:21:35 crc kubenswrapper[4883]: I0310 09:21:35.917119 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd"} err="failed to get container status \"a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd\": rpc error: code = NotFound desc = could not find container \"a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd\": container with ID starting with a2223ee726e62344e6af3d130f67347cbdd7618b31dcff5d063efd09b0249bcd not found: ID does not exist" Mar 10 09:21:36 crc kubenswrapper[4883]: I0310 09:21:36.089024 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" path="/var/lib/kubelet/pods/8010fd0c-6a0f-4078-851d-aff31b9efa90/volumes" Mar 10 09:21:38 crc kubenswrapper[4883]: I0310 09:21:38.886964 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"249a9bf5-ef0f-4209-855e-3fa422106519","Type":"ContainerStarted","Data":"149b82326646a72af1f60664f1c7e944ac1d163d90fea10e539ffa7351dfffd1"} Mar 10 09:21:38 crc kubenswrapper[4883]: I0310 09:21:38.889174 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"be383ddb-b33d-4129-acf8-1ffbbc21b1d4","Type":"ContainerStarted","Data":"427ce480cdf58a25d715353338cd5927df39a666005d830c3d5b50ef1cdcab10"} Mar 10 09:21:38 crc kubenswrapper[4883]: I0310 09:21:38.910870 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.619463932 podStartE2EDuration="17.910853563s" podCreationTimestamp="2026-03-10 09:21:21 +0000 UTC" firstStartedPulling="2026-03-10 09:21:26.541646493 +0000 UTC m=+1072.796544382" lastFinishedPulling="2026-03-10 09:21:37.833036123 +0000 UTC m=+1084.087934013" observedRunningTime="2026-03-10 09:21:38.90550025 +0000 UTC m=+1085.160398138" watchObservedRunningTime="2026-03-10 09:21:38.910853563 +0000 UTC m=+1085.165751453" Mar 
10 09:21:38 crc kubenswrapper[4883]: I0310 09:21:38.925951 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.5844231 podStartE2EDuration="21.925936433s" podCreationTimestamp="2026-03-10 09:21:17 +0000 UTC" firstStartedPulling="2026-03-10 09:21:26.476411097 +0000 UTC m=+1072.731308985" lastFinishedPulling="2026-03-10 09:21:37.817924429 +0000 UTC m=+1084.072822318" observedRunningTime="2026-03-10 09:21:38.921005566 +0000 UTC m=+1085.175903454" watchObservedRunningTime="2026-03-10 09:21:38.925936433 +0000 UTC m=+1085.180834323" Mar 10 09:21:39 crc kubenswrapper[4883]: I0310 09:21:39.811464 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:39 crc kubenswrapper[4883]: I0310 09:21:39.843174 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:39 crc kubenswrapper[4883]: I0310 09:21:39.894720 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:39 crc kubenswrapper[4883]: I0310 09:21:39.924331 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145206 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"] Mar 10 09:21:40 crc kubenswrapper[4883]: E0310 09:21:40.145647 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a815ae-f56c-4ad8-a4cd-b202012bf94a" containerName="init" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145668 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a815ae-f56c-4ad8-a4cd-b202012bf94a" containerName="init" Mar 10 09:21:40 crc kubenswrapper[4883]: E0310 09:21:40.145715 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" 
containerName="init" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145722 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerName="init" Mar 10 09:21:40 crc kubenswrapper[4883]: E0310 09:21:40.145741 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerName="dnsmasq-dns" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145747 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerName="dnsmasq-dns" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145936 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="8010fd0c-6a0f-4078-851d-aff31b9efa90" containerName="dnsmasq-dns" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.145959 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a815ae-f56c-4ad8-a4cd-b202012bf94a" containerName="init" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.146842 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.149387 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.154003 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"] Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.177941 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-b2z2p"] Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.179038 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.185735 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.193737 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-b2z2p"] Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271028 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovs-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271100 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271142 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlj5h\" (UniqueName: \"kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271157 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-combined-ca-bundle\") pod \"ovn-controller-metrics-b2z2p\" (UID: 
\"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271178 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvnns\" (UniqueName: \"kubernetes.io/projected/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-kube-api-access-gvnns\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271428 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271500 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-config\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271691 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovn-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271832 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config\") pod 
\"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.271903 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374501 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovn-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374671 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374739 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374778 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovs-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " 
pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374824 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374845 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovn-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374873 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlj5h\" (UniqueName: \"kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374894 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-combined-ca-bundle\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.374920 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvnns\" (UniqueName: \"kubernetes.io/projected/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-kube-api-access-gvnns\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " 
pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.375183 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.375184 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-ovs-rundir\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.375208 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-config\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.376155 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-config\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.376195 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.376335 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.376390 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.386218 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.386329 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-combined-ca-bundle\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.390605 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlj5h\" (UniqueName: \"kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h\") pod \"dnsmasq-dns-6444958b7f-nxp9v\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.393640 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gvnns\" (UniqueName: \"kubernetes.io/projected/570aed6d-03dc-4ad5-b0e1-c6efc4facabb-kube-api-access-gvnns\") pod \"ovn-controller-metrics-b2z2p\" (UID: \"570aed6d-03dc-4ad5-b0e1-c6efc4facabb\") " pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.463052 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.495267 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-b2z2p" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.568174 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.584939 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"] Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.593436 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"] Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.600664 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.603385 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.603824 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"] Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.631128 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.679873 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.680214 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.680263 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.680327 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqx5f\" (UniqueName: 
\"kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.680372 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.782149 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.782281 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.782323 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.782368 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqx5f\" (UniqueName: 
\"kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.782396 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.783155 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.783305 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.783561 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.783630 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc\") pod 
\"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.800410 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqx5f\" (UniqueName: \"kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f\") pod \"dnsmasq-dns-7b57d9888c-qq85c\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.904542 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.921229 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"] Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.932538 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:40 crc kubenswrapper[4883]: W0310 09:21:40.936276 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb2ce411_30fc_481a_bfe4_a73537462f13.slice/crio-b9ce573ba7666deddaf4bbfea815e9be85feaa65aa156c0b1b45685812d7e264 WatchSource:0}: Error finding container b9ce573ba7666deddaf4bbfea815e9be85feaa65aa156c0b1b45685812d7e264: Status 404 returned error can't find the container with id b9ce573ba7666deddaf4bbfea815e9be85feaa65aa156c0b1b45685812d7e264 Mar 10 09:21:40 crc kubenswrapper[4883]: I0310 09:21:40.939694 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.009755 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-b2z2p"] Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.074132 4883 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.075591 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.080260 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.080284 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.086122 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.094093 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-znz9b" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.094266 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.094395 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.096232 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.173889 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.191399 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgxrq\" (UniqueName: \"kubernetes.io/projected/b47099e9-f945-4873-a704-ee55b0f0ac46-kube-api-access-zgxrq\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: 
I0310 09:21:41.191468 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.191594 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-scripts\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.191893 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.191932 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.191959 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-config\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.192114 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294148 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294418 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294443 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-config\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294531 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294566 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgxrq\" (UniqueName: \"kubernetes.io/projected/b47099e9-f945-4873-a704-ee55b0f0ac46-kube-api-access-zgxrq\") pod \"ovn-northd-0\" (UID: 
\"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294586 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.294625 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-scripts\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.295645 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-config\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.295751 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b47099e9-f945-4873-a704-ee55b0f0ac46-scripts\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.296310 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.300894 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.301605 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.301825 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b47099e9-f945-4873-a704-ee55b0f0ac46-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.310369 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgxrq\" (UniqueName: \"kubernetes.io/projected/b47099e9-f945-4873-a704-ee55b0f0ac46-kube-api-access-zgxrq\") pod \"ovn-northd-0\" (UID: \"b47099e9-f945-4873-a704-ee55b0f0ac46\") " pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.401119 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"] Mar 10 09:21:41 crc kubenswrapper[4883]: W0310 09:21:41.402762 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34eb524a_8ba3_4157_8a0c_efd069843d47.slice/crio-217b807bc42e3fb9b90b1828f1947e2dec1e612d0200d8cf3f6228734fe68a4c WatchSource:0}: Error finding container 217b807bc42e3fb9b90b1828f1947e2dec1e612d0200d8cf3f6228734fe68a4c: Status 404 returned error can't find the container with id 217b807bc42e3fb9b90b1828f1947e2dec1e612d0200d8cf3f6228734fe68a4c Mar 10 09:21:41 crc 
kubenswrapper[4883]: I0310 09:21:41.404622 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.831122 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.911348 4883 generic.go:334] "Generic (PLEG): container finished" podID="eb2ce411-30fc-481a-bfe4-a73537462f13" containerID="f83a160ef6da8cae8c43ffcc9b804f634b417db7bccd27ca82a5481ab275d14d" exitCode=0 Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.911434 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" event={"ID":"eb2ce411-30fc-481a-bfe4-a73537462f13","Type":"ContainerDied","Data":"f83a160ef6da8cae8c43ffcc9b804f634b417db7bccd27ca82a5481ab275d14d"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.912846 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" event={"ID":"eb2ce411-30fc-481a-bfe4-a73537462f13","Type":"ContainerStarted","Data":"b9ce573ba7666deddaf4bbfea815e9be85feaa65aa156c0b1b45685812d7e264"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.916657 4883 generic.go:334] "Generic (PLEG): container finished" podID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerID="ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210" exitCode=0 Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.918138 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" event={"ID":"34eb524a-8ba3-4157-8a0c-efd069843d47","Type":"ContainerDied","Data":"ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.918173 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" 
event={"ID":"34eb524a-8ba3-4157-8a0c-efd069843d47","Type":"ContainerStarted","Data":"217b807bc42e3fb9b90b1828f1947e2dec1e612d0200d8cf3f6228734fe68a4c"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.924597 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-b2z2p" event={"ID":"570aed6d-03dc-4ad5-b0e1-c6efc4facabb","Type":"ContainerStarted","Data":"72811aa537601afcdfe38454de17e8b1a22c3617c3e3f18a7e9d5f5d5019c053"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.924633 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-b2z2p" event={"ID":"570aed6d-03dc-4ad5-b0e1-c6efc4facabb","Type":"ContainerStarted","Data":"c9dde48ae5937ebc556ac88d14157e6ec7995d79e53bd665fb08c6c831ed287b"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.926901 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b47099e9-f945-4873-a704-ee55b0f0ac46","Type":"ContainerStarted","Data":"f1b12d1d4dcfaf0291f0b1f6afa72663645f37817f4e3027a8139ee596f05c86"} Mar 10 09:21:41 crc kubenswrapper[4883]: I0310 09:21:41.963529 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-b2z2p" podStartSLOduration=1.963501172 podStartE2EDuration="1.963501172s" podCreationTimestamp="2026-03-10 09:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:21:41.951962526 +0000 UTC m=+1088.206860415" watchObservedRunningTime="2026-03-10 09:21:41.963501172 +0000 UTC m=+1088.218399061" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.255776 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.269444 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.325102 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.325246 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlj5h\" (UniqueName: \"kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.325308 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.325357 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.331600 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h" (OuterVolumeSpecName: "kube-api-access-jlj5h") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13"). InnerVolumeSpecName "kube-api-access-jlj5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:42 crc kubenswrapper[4883]: E0310 09:21:42.357245 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc podName:eb2ce411-30fc-481a-bfe4-a73537462f13 nodeName:}" failed. No retries permitted until 2026-03-10 09:21:42.857210813 +0000 UTC m=+1089.112108703 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13") : error deleting /var/lib/kubelet/pods/eb2ce411-30fc-481a-bfe4-a73537462f13/volume-subpaths: remove /var/lib/kubelet/pods/eb2ce411-30fc-481a-bfe4-a73537462f13/volume-subpaths: no such file or directory Mar 10 09:21:42 crc kubenswrapper[4883]: E0310 09:21:42.357281 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb podName:eb2ce411-30fc-481a-bfe4-a73537462f13 nodeName:}" failed. No retries permitted until 2026-03-10 09:21:42.85727331 +0000 UTC m=+1089.112171199 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13") : error deleting /var/lib/kubelet/pods/eb2ce411-30fc-481a-bfe4-a73537462f13/volume-subpaths: remove /var/lib/kubelet/pods/eb2ce411-30fc-481a-bfe4-a73537462f13/volume-subpaths: no such file or directory Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.357844 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config" (OuterVolumeSpecName: "config") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.428545 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlj5h\" (UniqueName: \"kubernetes.io/projected/eb2ce411-30fc-481a-bfe4-a73537462f13-kube-api-access-jlj5h\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.428588 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.578540 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.578846 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.646328 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.936619 4883 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.936760 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") pod \"eb2ce411-30fc-481a-bfe4-a73537462f13\" (UID: \"eb2ce411-30fc-481a-bfe4-a73537462f13\") " Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.937270 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.937351 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb2ce411-30fc-481a-bfe4-a73537462f13" (UID: "eb2ce411-30fc-481a-bfe4-a73537462f13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.937646 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.938098 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-nxp9v" event={"ID":"eb2ce411-30fc-481a-bfe4-a73537462f13","Type":"ContainerDied","Data":"b9ce573ba7666deddaf4bbfea815e9be85feaa65aa156c0b1b45685812d7e264"} Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.938184 4883 scope.go:117] "RemoveContainer" containerID="f83a160ef6da8cae8c43ffcc9b804f634b417db7bccd27ca82a5481ab275d14d" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.940833 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" event={"ID":"34eb524a-8ba3-4157-8a0c-efd069843d47","Type":"ContainerStarted","Data":"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351"} Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.941672 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.964553 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" podStartSLOduration=2.964526955 podStartE2EDuration="2.964526955s" podCreationTimestamp="2026-03-10 09:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:21:42.959160527 +0000 UTC m=+1089.214058426" watchObservedRunningTime="2026-03-10 09:21:42.964526955 +0000 UTC m=+1089.219424845" Mar 10 09:21:42 crc kubenswrapper[4883]: I0310 09:21:42.985670 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.009992 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"] Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 
09:21:43.015310 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-nxp9v"] Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.020937 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.039033 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.040259 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb2ce411-30fc-481a-bfe4-a73537462f13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.226874 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pv8r6"] Mar 10 09:21:43 crc kubenswrapper[4883]: E0310 09:21:43.227272 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2ce411-30fc-481a-bfe4-a73537462f13" containerName="init" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.227286 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2ce411-30fc-481a-bfe4-a73537462f13" containerName="init" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.227488 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2ce411-30fc-481a-bfe4-a73537462f13" containerName="init" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.228050 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pv8r6" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.243052 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8903-account-create-update-lxrp4"] Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.244387 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8903-account-create-update-lxrp4" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.246260 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.248677 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8903-account-create-update-lxrp4"] Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.257764 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pv8r6"] Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.349028 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.349397 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnddv\" (UniqueName: \"kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.349443 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.349492 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwvs8\" (UniqueName: \"kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.450709 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnddv\" (UniqueName: \"kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.451036 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.451294 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwvs8\" (UniqueName: \"kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.451939 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.452119 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.453074 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.466218 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwvs8\" (UniqueName: \"kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8\") pod \"glance-db-create-pv8r6\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " pod="openstack/glance-db-create-pv8r6" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.468463 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnddv\" (UniqueName: \"kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv\") pod \"glance-8903-account-create-update-lxrp4\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " pod="openstack/glance-8903-account-create-update-lxrp4" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.544786 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pv8r6" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.563701 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8903-account-create-update-lxrp4" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.957156 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b47099e9-f945-4873-a704-ee55b0f0ac46","Type":"ContainerStarted","Data":"2862e53b8919872fce8a435977928cd15d8b5473ec1ce12df00e6ab89bc3fc6b"} Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.957648 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b47099e9-f945-4873-a704-ee55b0f0ac46","Type":"ContainerStarted","Data":"785ae0bad3e887c65aef7d320008a437a53cfb493cc40ce86aa9113ab88165d4"} Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.958328 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.972356 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pv8r6"] Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.982264 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mj8nd"] Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.984275 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mj8nd" Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.991712 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mj8nd"] Mar 10 09:21:43 crc kubenswrapper[4883]: I0310 09:21:43.992908 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.378284609 podStartE2EDuration="2.992886499s" podCreationTimestamp="2026-03-10 09:21:41 +0000 UTC" firstStartedPulling="2026-03-10 09:21:41.836034263 +0000 UTC m=+1088.090932152" lastFinishedPulling="2026-03-10 09:21:43.450636163 +0000 UTC m=+1089.705534042" observedRunningTime="2026-03-10 09:21:43.976786682 +0000 UTC m=+1090.231684570" watchObservedRunningTime="2026-03-10 09:21:43.992886499 +0000 UTC m=+1090.247784388" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.037326 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8903-account-create-update-lxrp4"] Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.066094 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxrzk\" (UniqueName: \"kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.066488 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.100493 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="eb2ce411-30fc-481a-bfe4-a73537462f13" path="/var/lib/kubelet/pods/eb2ce411-30fc-481a-bfe4-a73537462f13/volumes" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.103925 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d500-account-create-update-fpfdr"] Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.105454 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d500-account-create-update-fpfdr" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.108041 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.118341 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d500-account-create-update-fpfdr"] Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.168718 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.168805 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwk5\" (UniqueName: \"kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5\") pod \"keystone-d500-account-create-update-fpfdr\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.168833 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts\") pod \"keystone-d500-account-create-update-fpfdr\" (UID: 
\"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.168984 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxrzk\" (UniqueName: \"kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.171376 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.185322 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxrzk\" (UniqueName: \"kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk\") pod \"keystone-db-create-mj8nd\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " pod="openstack/keystone-db-create-mj8nd" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.193223 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-j9kwf"] Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.194557 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-j9kwf" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.200968 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-j9kwf"] Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.271374 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwk5\" (UniqueName: \"kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5\") pod \"keystone-d500-account-create-update-fpfdr\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.271430 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf2pd\" (UniqueName: \"kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.271467 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts\") pod \"keystone-d500-account-create-update-fpfdr\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.271860 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.272405 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts\") pod \"keystone-d500-account-create-update-fpfdr\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.290283 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwk5\" (UniqueName: \"kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5\") pod \"keystone-d500-account-create-update-fpfdr\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " pod="openstack/keystone-d500-account-create-update-fpfdr" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.298315 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c7c6-account-create-update-bzdlt"] Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.299323 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c7c6-account-create-update-bzdlt" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.300882 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.301594 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mj8nd" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.320752 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c7c6-account-create-update-bzdlt"] Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.375648 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.375781 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf2pd\" (UniqueName: \"kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.375833 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg4kr\" (UniqueName: \"kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.375895 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.376602 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.395272 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf2pd\" (UniqueName: \"kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd\") pod \"placement-db-create-j9kwf\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " pod="openstack/placement-db-create-j9kwf" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.422926 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d500-account-create-update-fpfdr" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.476754 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg4kr\" (UniqueName: \"kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.476858 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.477917 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.499671 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg4kr\" (UniqueName: \"kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr\") pod \"placement-c7c6-account-create-update-bzdlt\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " pod="openstack/placement-c7c6-account-create-update-bzdlt" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.507057 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-j9kwf" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.640905 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c7c6-account-create-update-bzdlt" Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.708095 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mj8nd"] Mar 10 09:21:44 crc kubenswrapper[4883]: W0310 09:21:44.711141 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod258d7844_9a92_460a_a768_a5dca2fb5db9.slice/crio-3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34 WatchSource:0}: Error finding container 3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34: Status 404 returned error can't find the container with id 3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34 Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.860266 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d500-account-create-update-fpfdr"] Mar 10 09:21:44 crc kubenswrapper[4883]: W0310 09:21:44.863547 4883 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod486b3226_21be_4783_8b29_abaf747a7693.slice/crio-e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9 WatchSource:0}: Error finding container e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9: Status 404 returned error can't find the container with id e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9 Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.933895 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-j9kwf"] Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.988059 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d500-account-create-update-fpfdr" event={"ID":"486b3226-21be-4783-8b29-abaf747a7693","Type":"ContainerStarted","Data":"e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9"} Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.989947 4883 generic.go:334] "Generic (PLEG): container finished" podID="84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" containerID="b981b386d21855c9b21b1262acdcccebfb4995ef8da840373e95a5a29e03699c" exitCode=0 Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.990010 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pv8r6" event={"ID":"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34","Type":"ContainerDied","Data":"b981b386d21855c9b21b1262acdcccebfb4995ef8da840373e95a5a29e03699c"} Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.990041 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pv8r6" event={"ID":"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34","Type":"ContainerStarted","Data":"c3c2965142f5b5713694be3cd8baab99c20a6e06108db60b12703ba3ffb904b6"} Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.993931 4883 generic.go:334] "Generic (PLEG): container finished" podID="698612ed-a736-4d3d-9a0e-4c75fdd1400f" 
containerID="e7877e4d896a5e48fb94d0bb9e636d179a97dbbe531d524d3bf059533ec08d74" exitCode=0 Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.993983 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8903-account-create-update-lxrp4" event={"ID":"698612ed-a736-4d3d-9a0e-4c75fdd1400f","Type":"ContainerDied","Data":"e7877e4d896a5e48fb94d0bb9e636d179a97dbbe531d524d3bf059533ec08d74"} Mar 10 09:21:44 crc kubenswrapper[4883]: I0310 09:21:44.994002 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8903-account-create-update-lxrp4" event={"ID":"698612ed-a736-4d3d-9a0e-4c75fdd1400f","Type":"ContainerStarted","Data":"b951262476bceeb4b809409cf98537b24497c176feb26e10d9d261770e483efe"} Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:44.995946 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mj8nd" event={"ID":"258d7844-9a92-460a-a768-a5dca2fb5db9","Type":"ContainerStarted","Data":"067357e3a311966c2851280ac43f9cd60519f964886eed01635d2c8acbc8045b"} Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:44.995972 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mj8nd" event={"ID":"258d7844-9a92-460a-a768-a5dca2fb5db9","Type":"ContainerStarted","Data":"3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34"} Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.003121 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j9kwf" event={"ID":"6195b8a8-c8aa-4d92-b58b-066a2df99bd3","Type":"ContainerStarted","Data":"136224520a357418a498376a1cbdd0153ccb3d2fdb86788ac7b44dece177b573"} Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.071975 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c7c6-account-create-update-bzdlt"] Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.232051 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"] Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.232292 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="dnsmasq-dns" containerID="cri-o://8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351" gracePeriod=10 Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.248369 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.267977 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"] Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.269415 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.302648 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27qgz\" (UniqueName: \"kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.302697 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.302754 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.302785 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.302808 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.310331 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"] Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.405382 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.405489 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.405542 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.405737 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27qgz\" (UniqueName: \"kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.405786 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.406315 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.406694 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.406949 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.407450 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.457174 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27qgz\" (UniqueName: \"kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz\") pod \"dnsmasq-dns-675f7dd995-ll7zk\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.590800 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:45 crc kubenswrapper[4883]: I0310 09:21:45.890111 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.012270 4883 generic.go:334] "Generic (PLEG): container finished" podID="58599ed2-6176-4003-8bdc-2a1d805da51f" containerID="9b3a01ef455743297929fe3e8d915e6b5c1a6d87ee8313151edd54b3c5c1c1d3" exitCode=0 Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.012395 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c7c6-account-create-update-bzdlt" event={"ID":"58599ed2-6176-4003-8bdc-2a1d805da51f","Type":"ContainerDied","Data":"9b3a01ef455743297929fe3e8d915e6b5c1a6d87ee8313151edd54b3c5c1c1d3"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.012498 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c7c6-account-create-update-bzdlt" event={"ID":"58599ed2-6176-4003-8bdc-2a1d805da51f","Type":"ContainerStarted","Data":"fb11cd34a85d26912302a2922a691d05843fde0fdc020b7213b5e6c9c65ef2fe"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.014454 4883 generic.go:334] "Generic (PLEG): container finished" podID="258d7844-9a92-460a-a768-a5dca2fb5db9" containerID="067357e3a311966c2851280ac43f9cd60519f964886eed01635d2c8acbc8045b" exitCode=0 Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.014555 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mj8nd" event={"ID":"258d7844-9a92-460a-a768-a5dca2fb5db9","Type":"ContainerDied","Data":"067357e3a311966c2851280ac43f9cd60519f964886eed01635d2c8acbc8045b"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.015145 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb\") pod \"34eb524a-8ba3-4157-8a0c-efd069843d47\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.015182 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb\") pod \"34eb524a-8ba3-4157-8a0c-efd069843d47\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.015241 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config\") pod \"34eb524a-8ba3-4157-8a0c-efd069843d47\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.016250 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqx5f\" (UniqueName: \"kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f\") pod \"34eb524a-8ba3-4157-8a0c-efd069843d47\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.016378 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc\") pod \"34eb524a-8ba3-4157-8a0c-efd069843d47\" (UID: \"34eb524a-8ba3-4157-8a0c-efd069843d47\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.021032 4883 generic.go:334] "Generic (PLEG): container finished" podID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerID="8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351" exitCode=0 Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.021105 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" event={"ID":"34eb524a-8ba3-4157-8a0c-efd069843d47","Type":"ContainerDied","Data":"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.021128 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" event={"ID":"34eb524a-8ba3-4157-8a0c-efd069843d47","Type":"ContainerDied","Data":"217b807bc42e3fb9b90b1828f1947e2dec1e612d0200d8cf3f6228734fe68a4c"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.021149 4883 scope.go:117] "RemoveContainer" containerID="8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.021304 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-qq85c" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.030759 4883 generic.go:334] "Generic (PLEG): container finished" podID="6195b8a8-c8aa-4d92-b58b-066a2df99bd3" containerID="86cc309342e04f12de9f243fac1e7adc270651f62f05738383b3854942ebc072" exitCode=0 Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.030839 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j9kwf" event={"ID":"6195b8a8-c8aa-4d92-b58b-066a2df99bd3","Type":"ContainerDied","Data":"86cc309342e04f12de9f243fac1e7adc270651f62f05738383b3854942ebc072"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.033749 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"] Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.036647 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f" (OuterVolumeSpecName: "kube-api-access-cqx5f") pod "34eb524a-8ba3-4157-8a0c-efd069843d47" (UID: "34eb524a-8ba3-4157-8a0c-efd069843d47"). InnerVolumeSpecName "kube-api-access-cqx5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.037770 4883 generic.go:334] "Generic (PLEG): container finished" podID="486b3226-21be-4783-8b29-abaf747a7693" containerID="6677f5c2edc8cf5df63041699d2713762ffd5b4bdf18bb3f374e397d55004166" exitCode=0 Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.038031 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d500-account-create-update-fpfdr" event={"ID":"486b3226-21be-4783-8b29-abaf747a7693","Type":"ContainerDied","Data":"6677f5c2edc8cf5df63041699d2713762ffd5b4bdf18bb3f374e397d55004166"} Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.063649 4883 scope.go:117] "RemoveContainer" containerID="ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.067936 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "34eb524a-8ba3-4157-8a0c-efd069843d47" (UID: "34eb524a-8ba3-4157-8a0c-efd069843d47"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.075355 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config" (OuterVolumeSpecName: "config") pod "34eb524a-8ba3-4157-8a0c-efd069843d47" (UID: "34eb524a-8ba3-4157-8a0c-efd069843d47"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.080026 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34eb524a-8ba3-4157-8a0c-efd069843d47" (UID: "34eb524a-8ba3-4157-8a0c-efd069843d47"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.081727 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "34eb524a-8ba3-4157-8a0c-efd069843d47" (UID: "34eb524a-8ba3-4157-8a0c-efd069843d47"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.099013 4883 scope.go:117] "RemoveContainer" containerID="8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351" Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.099438 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351\": container with ID starting with 8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351 not found: ID does not exist" containerID="8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.099495 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351"} err="failed to get container status \"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351\": rpc error: code = NotFound desc = could not find container 
\"8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351\": container with ID starting with 8269cca53bd6842d1b123435d5a022a762e81646331480446afa34ddc06ef351 not found: ID does not exist" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.099523 4883 scope.go:117] "RemoveContainer" containerID="ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210" Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.100700 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210\": container with ID starting with ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210 not found: ID does not exist" containerID="ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.100727 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210"} err="failed to get container status \"ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210\": rpc error: code = NotFound desc = could not find container \"ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210\": container with ID starting with ed0b8a3362eba2e9cc0fd292e0c962bc2538d86ace6403371bd9cd0df9f27210 not found: ID does not exist" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.119678 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.119701 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: 
I0310 09:21:46.119710 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.119720 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34eb524a-8ba3-4157-8a0c-efd069843d47-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.119729 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqx5f\" (UniqueName: \"kubernetes.io/projected/34eb524a-8ba3-4157-8a0c-efd069843d47-kube-api-access-cqx5f\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.364979 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"] Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.368951 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pv8r6" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.377715 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-qq85c"] Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.402726 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8903-account-create-update-lxrp4" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.414927 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.415409 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="dnsmasq-dns" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.415515 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="dnsmasq-dns" Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.415595 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698612ed-a736-4d3d-9a0e-4c75fdd1400f" containerName="mariadb-account-create-update" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.415644 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="698612ed-a736-4d3d-9a0e-4c75fdd1400f" containerName="mariadb-account-create-update" Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.415707 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="init" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.415753 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="init" Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.415821 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.415868 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.416150 4883 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="698612ed-a736-4d3d-9a0e-4c75fdd1400f" containerName="mariadb-account-create-update" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.416233 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" containerName="dnsmasq-dns" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.416291 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.423496 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.423688 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.425443 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts\") pod \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.425749 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwvs8\" (UniqueName: \"kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8\") pod \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\" (UID: \"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.426374 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" (UID: "84b33a38-a6de-4d7e-b24d-ecf5f95f6c34"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.426464 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.427317 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.427985 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.428132 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lfqvv" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.428160 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.437382 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8" (OuterVolumeSpecName: "kube-api-access-pwvs8") pod "84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" (UID: "84b33a38-a6de-4d7e-b24d-ecf5f95f6c34"). InnerVolumeSpecName "kube-api-access-pwvs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.440050 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mj8nd" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.482268 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-n4vhh"] Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.483514 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258d7844-9a92-460a-a768-a5dca2fb5db9" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.483600 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="258d7844-9a92-460a-a768-a5dca2fb5db9" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.483844 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="258d7844-9a92-460a-a768-a5dca2fb5db9" containerName="mariadb-database-create" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.484636 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.488320 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.488634 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.488767 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.499249 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-n4vhh"] Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.527768 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnddv\" (UniqueName: \"kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv\") pod 
\"698612ed-a736-4d3d-9a0e-4c75fdd1400f\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.528317 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxrzk\" (UniqueName: \"kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk\") pod \"258d7844-9a92-460a-a768-a5dca2fb5db9\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.528374 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts\") pod \"258d7844-9a92-460a-a768-a5dca2fb5db9\" (UID: \"258d7844-9a92-460a-a768-a5dca2fb5db9\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.528799 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts\") pod \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\" (UID: \"698612ed-a736-4d3d-9a0e-4c75fdd1400f\") " Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529131 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "258d7844-9a92-460a-a768-a5dca2fb5db9" (UID: "258d7844-9a92-460a-a768-a5dca2fb5db9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529206 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "698612ed-a736-4d3d-9a0e-4c75fdd1400f" (UID: "698612ed-a736-4d3d-9a0e-4c75fdd1400f"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529257 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529312 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-cache\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529498 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng2x8\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-kube-api-access-ng2x8\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529564 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529611 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " 
pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529658 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s66g\" (UniqueName: \"kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529760 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529818 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529833 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fdf41f-a914-4d0f-8d0c-5e378567a2db-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529885 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " 
pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529906 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-lock\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529933 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.529952 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.530139 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/698612ed-a736-4d3d-9a0e-4c75fdd1400f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.530177 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwvs8\" (UniqueName: \"kubernetes.io/projected/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34-kube-api-access-pwvs8\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.530191 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/258d7844-9a92-460a-a768-a5dca2fb5db9-operator-scripts\") on node \"crc\" DevicePath 
\"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.532037 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk" (OuterVolumeSpecName: "kube-api-access-dxrzk") pod "258d7844-9a92-460a-a768-a5dca2fb5db9" (UID: "258d7844-9a92-460a-a768-a5dca2fb5db9"). InnerVolumeSpecName "kube-api-access-dxrzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.532994 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv" (OuterVolumeSpecName: "kube-api-access-dnddv") pod "698612ed-a736-4d3d-9a0e-4c75fdd1400f" (UID: "698612ed-a736-4d3d-9a0e-4c75fdd1400f"). InnerVolumeSpecName "kube-api-access-dnddv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632100 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng2x8\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-kube-api-access-ng2x8\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632155 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632184 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle\") pod \"swift-ring-rebalance-n4vhh\" 
(UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632214 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s66g\" (UniqueName: \"kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632259 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632290 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632309 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fdf41f-a914-4d0f-8d0c-5e378567a2db-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632334 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc 
kubenswrapper[4883]: I0310 09:21:46.632355 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-lock\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632377 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632393 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632442 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632463 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-cache\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632545 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnddv\" (UniqueName: \"kubernetes.io/projected/698612ed-a736-4d3d-9a0e-4c75fdd1400f-kube-api-access-dnddv\") on node 
\"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632561 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxrzk\" (UniqueName: \"kubernetes.io/projected/258d7844-9a92-460a-a768-a5dca2fb5db9-kube-api-access-dxrzk\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.632995 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-lock\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.633041 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/39fdf41f-a914-4d0f-8d0c-5e378567a2db-cache\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.633165 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.633238 4883 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.633259 4883 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 09:21:46 crc kubenswrapper[4883]: E0310 09:21:46.633324 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift podName:39fdf41f-a914-4d0f-8d0c-5e378567a2db nodeName:}" 
failed. No retries permitted until 2026-03-10 09:21:47.133304176 +0000 UTC m=+1093.388202065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift") pod "swift-storage-0" (UID: "39fdf41f-a914-4d0f-8d0c-5e378567a2db") : configmap "swift-ring-files" not found Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.633914 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.633917 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.634944 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.638566 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.639031 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39fdf41f-a914-4d0f-8d0c-5e378567a2db-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.639883 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.640439 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.649190 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng2x8\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-kube-api-access-ng2x8\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.649687 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s66g\" (UniqueName: \"kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g\") pod \"swift-ring-rebalance-n4vhh\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") " pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.653836 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: 
\"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:46 crc kubenswrapper[4883]: I0310 09:21:46.820613 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n4vhh" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.054014 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-n4vhh"] Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.065182 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8903-account-create-update-lxrp4" event={"ID":"698612ed-a736-4d3d-9a0e-4c75fdd1400f","Type":"ContainerDied","Data":"b951262476bceeb4b809409cf98537b24497c176feb26e10d9d261770e483efe"} Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.065498 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b951262476bceeb4b809409cf98537b24497c176feb26e10d9d261770e483efe" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.065590 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8903-account-create-update-lxrp4" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.079862 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mj8nd" event={"ID":"258d7844-9a92-460a-a768-a5dca2fb5db9","Type":"ContainerDied","Data":"3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34"} Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.080291 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3909945d92aa80c72296c578675f436135ee7b4e0c7c04b0dc225eb1b1e3ef34" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.080348 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mj8nd" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.081710 4883 generic.go:334] "Generic (PLEG): container finished" podID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerID="e241856c7e2d9bdc80ba8f22b6df1569df773a3d438f85a0d6ce70af2f1197e4" exitCode=0 Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.081811 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" event={"ID":"7b8b44ba-0853-4287-a0ba-ef1607c66d7b","Type":"ContainerDied","Data":"e241856c7e2d9bdc80ba8f22b6df1569df773a3d438f85a0d6ce70af2f1197e4"} Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.081849 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" event={"ID":"7b8b44ba-0853-4287-a0ba-ef1607c66d7b","Type":"ContainerStarted","Data":"4fe8d37616588503394b5e0c543034c9d75405c62d38c6a3b4276c05a61d4d46"} Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.091058 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pv8r6" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.091885 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pv8r6" event={"ID":"84b33a38-a6de-4d7e-b24d-ecf5f95f6c34","Type":"ContainerDied","Data":"c3c2965142f5b5713694be3cd8baab99c20a6e06108db60b12703ba3ffb904b6"} Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.091928 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3c2965142f5b5713694be3cd8baab99c20a6e06108db60b12703ba3ffb904b6" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.144356 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:47 crc kubenswrapper[4883]: E0310 09:21:47.145535 4883 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 09:21:47 crc kubenswrapper[4883]: E0310 09:21:47.145570 4883 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 09:21:47 crc kubenswrapper[4883]: E0310 09:21:47.145614 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift podName:39fdf41f-a914-4d0f-8d0c-5e378567a2db nodeName:}" failed. No retries permitted until 2026-03-10 09:21:48.145596013 +0000 UTC m=+1094.400493902 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift") pod "swift-storage-0" (UID: "39fdf41f-a914-4d0f-8d0c-5e378567a2db") : configmap "swift-ring-files" not found Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.448184 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-j9kwf" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.544051 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d500-account-create-update-fpfdr" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.550310 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c7c6-account-create-update-bzdlt" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.551382 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts\") pod \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.551587 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf2pd\" (UniqueName: \"kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd\") pod \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\" (UID: \"6195b8a8-c8aa-4d92-b58b-066a2df99bd3\") " Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.552195 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6195b8a8-c8aa-4d92-b58b-066a2df99bd3" (UID: "6195b8a8-c8aa-4d92-b58b-066a2df99bd3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.552335 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.560659 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd" (OuterVolumeSpecName: "kube-api-access-vf2pd") pod "6195b8a8-c8aa-4d92-b58b-066a2df99bd3" (UID: "6195b8a8-c8aa-4d92-b58b-066a2df99bd3"). InnerVolumeSpecName "kube-api-access-vf2pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.653134 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts\") pod \"58599ed2-6176-4003-8bdc-2a1d805da51f\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.653187 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg4kr\" (UniqueName: \"kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr\") pod \"58599ed2-6176-4003-8bdc-2a1d805da51f\" (UID: \"58599ed2-6176-4003-8bdc-2a1d805da51f\") " Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.653815 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts\") pod \"486b3226-21be-4783-8b29-abaf747a7693\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.653876 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mvwk5\" (UniqueName: \"kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5\") pod \"486b3226-21be-4783-8b29-abaf747a7693\" (UID: \"486b3226-21be-4783-8b29-abaf747a7693\") " Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.653807 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "58599ed2-6176-4003-8bdc-2a1d805da51f" (UID: "58599ed2-6176-4003-8bdc-2a1d805da51f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.654182 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58599ed2-6176-4003-8bdc-2a1d805da51f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.654208 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf2pd\" (UniqueName: \"kubernetes.io/projected/6195b8a8-c8aa-4d92-b58b-066a2df99bd3-kube-api-access-vf2pd\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.654252 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "486b3226-21be-4783-8b29-abaf747a7693" (UID: "486b3226-21be-4783-8b29-abaf747a7693"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.656430 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr" (OuterVolumeSpecName: "kube-api-access-kg4kr") pod "58599ed2-6176-4003-8bdc-2a1d805da51f" (UID: "58599ed2-6176-4003-8bdc-2a1d805da51f"). InnerVolumeSpecName "kube-api-access-kg4kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.656784 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5" (OuterVolumeSpecName: "kube-api-access-mvwk5") pod "486b3226-21be-4783-8b29-abaf747a7693" (UID: "486b3226-21be-4783-8b29-abaf747a7693"). InnerVolumeSpecName "kube-api-access-mvwk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.755670 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg4kr\" (UniqueName: \"kubernetes.io/projected/58599ed2-6176-4003-8bdc-2a1d805da51f-kube-api-access-kg4kr\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.755699 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486b3226-21be-4783-8b29-abaf747a7693-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:47 crc kubenswrapper[4883]: I0310 09:21:47.755711 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvwk5\" (UniqueName: \"kubernetes.io/projected/486b3226-21be-4783-8b29-abaf747a7693-kube-api-access-mvwk5\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.118920 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34eb524a-8ba3-4157-8a0c-efd069843d47" 
path="/var/lib/kubelet/pods/34eb524a-8ba3-4157-8a0c-efd069843d47/volumes" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.166131 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.167498 4883 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.167526 4883 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.167576 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift podName:39fdf41f-a914-4d0f-8d0c-5e378567a2db nodeName:}" failed. No retries permitted until 2026-03-10 09:21:50.167559746 +0000 UTC m=+1096.422457636 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift") pod "swift-storage-0" (UID: "39fdf41f-a914-4d0f-8d0c-5e378567a2db") : configmap "swift-ring-files" not found Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.178833 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" event={"ID":"7b8b44ba-0853-4287-a0ba-ef1607c66d7b","Type":"ContainerStarted","Data":"ab3d26f0cc1f7b7d4aa73624b5d0a56d07f024661893458f62e11a314a4cd5c5"} Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.179897 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.256995 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j9kwf" event={"ID":"6195b8a8-c8aa-4d92-b58b-066a2df99bd3","Type":"ContainerDied","Data":"136224520a357418a498376a1cbdd0153ccb3d2fdb86788ac7b44dece177b573"} Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.257036 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="136224520a357418a498376a1cbdd0153ccb3d2fdb86788ac7b44dece177b573" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.257113 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-j9kwf" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.270239 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n4vhh" event={"ID":"cbe93226-96c7-4854-abdc-4afe54ad7ad5","Type":"ContainerStarted","Data":"11c349d8ba3b1aaf24065124e390760d2cc6670f859985a574e75a6f9d822d8c"} Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.272122 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d500-account-create-update-fpfdr" event={"ID":"486b3226-21be-4783-8b29-abaf747a7693","Type":"ContainerDied","Data":"e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9"} Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.272177 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e075a6fd8268070db8b87167104eaeec692650a48d7b0ecfc8ce8d031b32fbf9" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.272280 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d500-account-create-update-fpfdr" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.282786 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c7c6-account-create-update-bzdlt" event={"ID":"58599ed2-6176-4003-8bdc-2a1d805da51f","Type":"ContainerDied","Data":"fb11cd34a85d26912302a2922a691d05843fde0fdc020b7213b5e6c9c65ef2fe"} Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.282833 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb11cd34a85d26912302a2922a691d05843fde0fdc020b7213b5e6c9c65ef2fe" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.282912 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c7c6-account-create-update-bzdlt" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.437153 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" podStartSLOduration=3.437103804 podStartE2EDuration="3.437103804s" podCreationTimestamp="2026-03-10 09:21:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:21:48.213763858 +0000 UTC m=+1094.468661747" watchObservedRunningTime="2026-03-10 09:21:48.437103804 +0000 UTC m=+1094.692001692" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443184 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xtrqg"] Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.443595 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486b3226-21be-4783-8b29-abaf747a7693" containerName="mariadb-account-create-update" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443614 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="486b3226-21be-4783-8b29-abaf747a7693" containerName="mariadb-account-create-update" Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.443637 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58599ed2-6176-4003-8bdc-2a1d805da51f" containerName="mariadb-account-create-update" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443644 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="58599ed2-6176-4003-8bdc-2a1d805da51f" containerName="mariadb-account-create-update" Mar 10 09:21:48 crc kubenswrapper[4883]: E0310 09:21:48.443659 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6195b8a8-c8aa-4d92-b58b-066a2df99bd3" containerName="mariadb-database-create" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443666 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6195b8a8-c8aa-4d92-b58b-066a2df99bd3" containerName="mariadb-database-create" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443828 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6195b8a8-c8aa-4d92-b58b-066a2df99bd3" containerName="mariadb-database-create" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443848 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="58599ed2-6176-4003-8bdc-2a1d805da51f" containerName="mariadb-account-create-update" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.443864 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="486b3226-21be-4783-8b29-abaf747a7693" containerName="mariadb-account-create-update" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.447078 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.449891 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.450701 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g4r2q" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.455584 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xtrqg"] Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.577883 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qnd\" (UniqueName: \"kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.577957 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.578010 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.578146 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.680244 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.680355 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qnd\" (UniqueName: \"kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.680383 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.680437 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.685575 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.687059 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.688865 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.696233 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qnd\" (UniqueName: \"kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd\") pod \"glance-db-sync-xtrqg\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") 
" pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:48 crc kubenswrapper[4883]: I0310 09:21:48.773701 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xtrqg" Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.272449 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xtrqg"] Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.717256 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zbx4l"] Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.720307 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.722390 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.725593 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zbx4l"] Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.909525 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j8q8\" (UniqueName: \"kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:49 crc kubenswrapper[4883]: I0310 09:21:49.909583 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.012208 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j8q8\" (UniqueName: \"kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.012297 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.013307 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.041069 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j8q8\" (UniqueName: \"kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8\") pod \"root-account-create-update-zbx4l\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:50 crc kubenswrapper[4883]: E0310 09:21:50.217972 4883 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 09:21:50 crc kubenswrapper[4883]: E0310 09:21:50.218022 4883 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 09:21:50 crc kubenswrapper[4883]: E0310 09:21:50.218110 4883 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift podName:39fdf41f-a914-4d0f-8d0c-5e378567a2db nodeName:}" failed. No retries permitted until 2026-03-10 09:21:54.218080084 +0000 UTC m=+1100.472977974 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift") pod "swift-storage-0" (UID: "39fdf41f-a914-4d0f-8d0c-5e378567a2db") : configmap "swift-ring-files" not found Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.222793 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:50 crc kubenswrapper[4883]: I0310 09:21:50.338119 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:51 crc kubenswrapper[4883]: W0310 09:21:51.153651 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5485539_c722_477d_b595_649e07eac50e.slice/crio-2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7 WatchSource:0}: Error finding container 2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7: Status 404 returned error can't find the container with id 2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7 Mar 10 09:21:51 crc kubenswrapper[4883]: I0310 09:21:51.335944 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xtrqg" event={"ID":"d5485539-c722-477d-b595-649e07eac50e","Type":"ContainerStarted","Data":"2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7"} Mar 10 09:21:51 crc kubenswrapper[4883]: I0310 09:21:51.568012 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zbx4l"] Mar 10 09:21:51 crc kubenswrapper[4883]: W0310 09:21:51.574179 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f11ff8e_7bba_408d_9d5f_6a3f3d16c280.slice/crio-7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb WatchSource:0}: Error finding container 7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb: Status 404 returned error can't find the container with id 7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb Mar 10 09:21:52 crc kubenswrapper[4883]: I0310 09:21:52.345534 4883 generic.go:334] "Generic (PLEG): container finished" podID="5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" containerID="86f2b3c9600146e785777999cdc5d4ea906b5ad635853fcbc695d4a1b48ea493" exitCode=0 Mar 10 09:21:52 crc kubenswrapper[4883]: I0310 09:21:52.345652 4883 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/root-account-create-update-zbx4l" event={"ID":"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280","Type":"ContainerDied","Data":"86f2b3c9600146e785777999cdc5d4ea906b5ad635853fcbc695d4a1b48ea493"} Mar 10 09:21:52 crc kubenswrapper[4883]: I0310 09:21:52.347928 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zbx4l" event={"ID":"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280","Type":"ContainerStarted","Data":"7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb"} Mar 10 09:21:52 crc kubenswrapper[4883]: I0310 09:21:52.347957 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n4vhh" event={"ID":"cbe93226-96c7-4854-abdc-4afe54ad7ad5","Type":"ContainerStarted","Data":"98e65d47610f9e922ce3470f42049e9e4521b7087bcbd4c73749d28c484a5cf9"} Mar 10 09:21:52 crc kubenswrapper[4883]: I0310 09:21:52.379084 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-n4vhh" podStartSLOduration=2.26811956 podStartE2EDuration="6.379066627s" podCreationTimestamp="2026-03-10 09:21:46 +0000 UTC" firstStartedPulling="2026-03-10 09:21:47.072902964 +0000 UTC m=+1093.327800853" lastFinishedPulling="2026-03-10 09:21:51.183850031 +0000 UTC m=+1097.438747920" observedRunningTime="2026-03-10 09:21:52.376540263 +0000 UTC m=+1098.631438152" watchObservedRunningTime="2026-03-10 09:21:52.379066627 +0000 UTC m=+1098.633964516" Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.670503 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.786017 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j8q8\" (UniqueName: \"kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8\") pod \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.786155 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts\") pod \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\" (UID: \"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280\") " Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.786690 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" (UID: "5f11ff8e-7bba-408d-9d5f-6a3f3d16c280"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.787085 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.793542 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8" (OuterVolumeSpecName: "kube-api-access-8j8q8") pod "5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" (UID: "5f11ff8e-7bba-408d-9d5f-6a3f3d16c280"). InnerVolumeSpecName "kube-api-access-8j8q8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:53 crc kubenswrapper[4883]: I0310 09:21:53.888804 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j8q8\" (UniqueName: \"kubernetes.io/projected/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280-kube-api-access-8j8q8\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:54 crc kubenswrapper[4883]: I0310 09:21:54.297347 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0" Mar 10 09:21:54 crc kubenswrapper[4883]: E0310 09:21:54.297597 4883 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 09:21:54 crc kubenswrapper[4883]: E0310 09:21:54.297616 4883 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 09:21:54 crc kubenswrapper[4883]: E0310 09:21:54.297667 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift podName:39fdf41f-a914-4d0f-8d0c-5e378567a2db nodeName:}" failed. No retries permitted until 2026-03-10 09:22:02.297652693 +0000 UTC m=+1108.552550581 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift") pod "swift-storage-0" (UID: "39fdf41f-a914-4d0f-8d0c-5e378567a2db") : configmap "swift-ring-files" not found Mar 10 09:21:54 crc kubenswrapper[4883]: I0310 09:21:54.372911 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zbx4l" event={"ID":"5f11ff8e-7bba-408d-9d5f-6a3f3d16c280","Type":"ContainerDied","Data":"7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb"} Mar 10 09:21:54 crc kubenswrapper[4883]: I0310 09:21:54.372953 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ca2ef89aed9e45eb9d3b9332aa73ac48af177844024c608d4f7b4c213e07acb" Mar 10 09:21:54 crc kubenswrapper[4883]: I0310 09:21:54.373011 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zbx4l" Mar 10 09:21:55 crc kubenswrapper[4883]: I0310 09:21:55.592732 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:21:55 crc kubenswrapper[4883]: I0310 09:21:55.638318 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"] Mar 10 09:21:55 crc kubenswrapper[4883]: I0310 09:21:55.638624 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="dnsmasq-dns" containerID="cri-o://4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc" gracePeriod=10 Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.114955 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.136750 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config\") pod \"6b114327-1a63-488a-aace-0488259b1278\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.136823 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc\") pod \"6b114327-1a63-488a-aace-0488259b1278\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.136981 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cksz8\" (UniqueName: \"kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8\") pod \"6b114327-1a63-488a-aace-0488259b1278\" (UID: \"6b114327-1a63-488a-aace-0488259b1278\") " Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.155149 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8" (OuterVolumeSpecName: "kube-api-access-cksz8") pod "6b114327-1a63-488a-aace-0488259b1278" (UID: "6b114327-1a63-488a-aace-0488259b1278"). InnerVolumeSpecName "kube-api-access-cksz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.185665 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config" (OuterVolumeSpecName: "config") pod "6b114327-1a63-488a-aace-0488259b1278" (UID: "6b114327-1a63-488a-aace-0488259b1278"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.190809 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b114327-1a63-488a-aace-0488259b1278" (UID: "6b114327-1a63-488a-aace-0488259b1278"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.240799 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cksz8\" (UniqueName: \"kubernetes.io/projected/6b114327-1a63-488a-aace-0488259b1278-kube-api-access-cksz8\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.240833 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.240843 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b114327-1a63-488a-aace-0488259b1278-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.244969 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zbx4l"] Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.268115 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zbx4l"] Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.405132 4883 generic.go:334] "Generic (PLEG): container finished" podID="6b114327-1a63-488a-aace-0488259b1278" containerID="4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc" exitCode=0 Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.405212 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" event={"ID":"6b114327-1a63-488a-aace-0488259b1278","Type":"ContainerDied","Data":"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc"} Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.405267 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" event={"ID":"6b114327-1a63-488a-aace-0488259b1278","Type":"ContainerDied","Data":"0f7fbcddfd5a9e8e2b01f02ad16e0ed221a09ead1d4f04beb704150072ccc53c"} Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.405266 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-ckpnm" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.405293 4883 scope.go:117] "RemoveContainer" containerID="4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.438509 4883 scope.go:117] "RemoveContainer" containerID="0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.448596 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"] Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.461711 4883 scope.go:117] "RemoveContainer" containerID="4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.462081 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-ckpnm"] Mar 10 09:21:56 crc kubenswrapper[4883]: E0310 09:21:56.462432 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc\": container with ID starting with 4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc not found: ID does not exist" 
containerID="4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.462497 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc"} err="failed to get container status \"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc\": rpc error: code = NotFound desc = could not find container \"4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc\": container with ID starting with 4a19a336933f8f62244d31940693399abdc9e3c24b9f8469529d8b74522ed9bc not found: ID does not exist" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.462533 4883 scope.go:117] "RemoveContainer" containerID="0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a" Mar 10 09:21:56 crc kubenswrapper[4883]: E0310 09:21:56.463033 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a\": container with ID starting with 0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a not found: ID does not exist" containerID="0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a" Mar 10 09:21:56 crc kubenswrapper[4883]: I0310 09:21:56.463081 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a"} err="failed to get container status \"0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a\": rpc error: code = NotFound desc = could not find container \"0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a\": container with ID starting with 0e41f839ac73722addb0d9863cab50b44083196bfea2e019427104c8fad4a08a not found: ID does not exist" Mar 10 09:21:57 crc kubenswrapper[4883]: I0310 09:21:57.413465 4883 generic.go:334] 
"Generic (PLEG): container finished" podID="cbe93226-96c7-4854-abdc-4afe54ad7ad5" containerID="98e65d47610f9e922ce3470f42049e9e4521b7087bcbd4c73749d28c484a5cf9" exitCode=0 Mar 10 09:21:57 crc kubenswrapper[4883]: I0310 09:21:57.413538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n4vhh" event={"ID":"cbe93226-96c7-4854-abdc-4afe54ad7ad5","Type":"ContainerDied","Data":"98e65d47610f9e922ce3470f42049e9e4521b7087bcbd4c73749d28c484a5cf9"} Mar 10 09:21:58 crc kubenswrapper[4883]: I0310 09:21:58.092251 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" path="/var/lib/kubelet/pods/5f11ff8e-7bba-408d-9d5f-6a3f3d16c280/volumes" Mar 10 09:21:58 crc kubenswrapper[4883]: I0310 09:21:58.093924 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b114327-1a63-488a-aace-0488259b1278" path="/var/lib/kubelet/pods/6b114327-1a63-488a-aace-0488259b1278/volumes" Mar 10 09:21:58 crc kubenswrapper[4883]: I0310 09:21:58.422423 4883 generic.go:334] "Generic (PLEG): container finished" podID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerID="8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5" exitCode=0 Mar 10 09:21:58 crc kubenswrapper[4883]: I0310 09:21:58.422600 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerDied","Data":"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5"} Mar 10 09:21:59 crc kubenswrapper[4883]: I0310 09:21:59.433084 4883 generic.go:334] "Generic (PLEG): container finished" podID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerID="cd676384987aa2244a851651d9f3c8c0df5750259fc99e409d5e9d090a18495f" exitCode=0 Mar 10 09:21:59 crc kubenswrapper[4883]: I0310 09:21:59.433177 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerDied","Data":"cd676384987aa2244a851651d9f3c8c0df5750259fc99e409d5e9d090a18495f"} Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.146492 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552242-kz9jr"] Mar 10 09:22:00 crc kubenswrapper[4883]: E0310 09:22:00.146920 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="dnsmasq-dns" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.146941 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="dnsmasq-dns" Mar 10 09:22:00 crc kubenswrapper[4883]: E0310 09:22:00.146970 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" containerName="mariadb-account-create-update" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.146977 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" containerName="mariadb-account-create-update" Mar 10 09:22:00 crc kubenswrapper[4883]: E0310 09:22:00.146987 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="init" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.146993 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="init" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.147149 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b114327-1a63-488a-aace-0488259b1278" containerName="dnsmasq-dns" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.147169 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f11ff8e-7bba-408d-9d5f-6a3f3d16c280" containerName="mariadb-account-create-update" Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.147804 4883 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-kz9jr"
Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.149683 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.149716 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.149879 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.157024 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-kz9jr"]
Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.316331 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv225\" (UniqueName: \"kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225\") pod \"auto-csr-approver-29552242-kz9jr\" (UID: \"12cada45-6ba5-4db1-9a13-3de652b390bb\") " pod="openshift-infra/auto-csr-approver-29552242-kz9jr"
Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.417914 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv225\" (UniqueName: \"kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225\") pod \"auto-csr-approver-29552242-kz9jr\" (UID: \"12cada45-6ba5-4db1-9a13-3de652b390bb\") " pod="openshift-infra/auto-csr-approver-29552242-kz9jr"
Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.446773 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv225\" (UniqueName: \"kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225\") pod \"auto-csr-approver-29552242-kz9jr\" (UID: \"12cada45-6ba5-4db1-9a13-3de652b390bb\") " pod="openshift-infra/auto-csr-approver-29552242-kz9jr"
Mar 10 09:22:00 crc kubenswrapper[4883]: I0310 09:22:00.463861 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-kz9jr"
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.256675 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cdc4t"]
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.257900 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.260580 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.263888 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cdc4t"]
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.451823 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.451991 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbm4m\" (UniqueName: \"kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.465810 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.553879 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.554038 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbm4m\" (UniqueName: \"kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.556527 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.570012 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbm4m\" (UniqueName: \"kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m\") pod \"root-account-create-update-cdc4t\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") " pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:01 crc kubenswrapper[4883]: I0310 09:22:01.585701 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:02 crc kubenswrapper[4883]: I0310 09:22:02.370324 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:22:02 crc kubenswrapper[4883]: I0310 09:22:02.377621 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39fdf41f-a914-4d0f-8d0c-5e378567a2db-etc-swift\") pod \"swift-storage-0\" (UID: \"39fdf41f-a914-4d0f-8d0c-5e378567a2db\") " pod="openstack/swift-storage-0"
Mar 10 09:22:02 crc kubenswrapper[4883]: I0310 09:22:02.653979 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.164190 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.293918 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") "
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.293991 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") "
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.294018 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s66g\" (UniqueName: \"kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") "
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.294059 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") "
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.294147 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") "
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.294261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") "
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.294385 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift\") pod \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\" (UID: \"cbe93226-96c7-4854-abdc-4afe54ad7ad5\") "
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.295219 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.295642 4883 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cbe93226-96c7-4854-abdc-4afe54ad7ad5-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.296405 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.301017 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g" (OuterVolumeSpecName: "kube-api-access-9s66g") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "kube-api-access-9s66g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.304585 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.312802 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts" (OuterVolumeSpecName: "scripts") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.315200 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.317263 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbe93226-96c7-4854-abdc-4afe54ad7ad5" (UID: "cbe93226-96c7-4854-abdc-4afe54ad7ad5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398326 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398363 4883 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398379 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s66g\" (UniqueName: \"kubernetes.io/projected/cbe93226-96c7-4854-abdc-4afe54ad7ad5-kube-api-access-9s66g\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398389 4883 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398400 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe93226-96c7-4854-abdc-4afe54ad7ad5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.398411 4883 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cbe93226-96c7-4854-abdc-4afe54ad7ad5-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.486175 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerStarted","Data":"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5"}
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.486826 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.488257 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n4vhh" event={"ID":"cbe93226-96c7-4854-abdc-4afe54ad7ad5","Type":"ContainerDied","Data":"11c349d8ba3b1aaf24065124e390760d2cc6670f859985a574e75a6f9d822d8c"}
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.488317 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11c349d8ba3b1aaf24065124e390760d2cc6670f859985a574e75a6f9d822d8c"
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.488333 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n4vhh"
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.493133 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerStarted","Data":"6b536f8286ba8f67ed7799e690ba32685de9ac7a70392f9e12efea3132a20a4e"}
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.493394 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.528210 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.965400521 podStartE2EDuration="55.528194601s" podCreationTimestamp="2026-03-10 09:21:08 +0000 UTC" firstStartedPulling="2026-03-10 09:21:10.207186917 +0000 UTC m=+1056.462084806" lastFinishedPulling="2026-03-10 09:21:25.769980996 +0000 UTC m=+1072.024878886" observedRunningTime="2026-03-10 09:22:03.513306147 +0000 UTC m=+1109.768204036" watchObservedRunningTime="2026-03-10 09:22:03.528194601 +0000 UTC m=+1109.783092490"
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.549876 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.057129204 podStartE2EDuration="55.549856418s" podCreationTimestamp="2026-03-10 09:21:08 +0000 UTC" firstStartedPulling="2026-03-10 09:21:10.409462633 +0000 UTC m=+1056.664360522" lastFinishedPulling="2026-03-10 09:21:25.902189847 +0000 UTC m=+1072.157087736" observedRunningTime="2026-03-10 09:22:03.537217365 +0000 UTC m=+1109.792115255" watchObservedRunningTime="2026-03-10 09:22:03.549856418 +0000 UTC m=+1109.804754308"
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.560997 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cdc4t"]
Mar 10 09:22:03 crc kubenswrapper[4883]: W0310 09:22:03.578964 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d523ed0_183e_4bec_a110_fe622b69ef79.slice/crio-35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce WatchSource:0}: Error finding container 35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce: Status 404 returned error can't find the container with id 35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.588518 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 10 09:22:03 crc kubenswrapper[4883]: W0310 09:22:03.588643 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39fdf41f_a914_4d0f_8d0c_5e378567a2db.slice/crio-0e48c8f74b7202798c3d97547fdee7356db2adb79887d4eb2a54a8474d824011 WatchSource:0}: Error finding container 0e48c8f74b7202798c3d97547fdee7356db2adb79887d4eb2a54a8474d824011: Status 404 returned error can't find the container with id 0e48c8f74b7202798c3d97547fdee7356db2adb79887d4eb2a54a8474d824011
Mar 10 09:22:03 crc kubenswrapper[4883]: I0310 09:22:03.973717 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-kz9jr"]
Mar 10 09:22:03 crc kubenswrapper[4883]: W0310 09:22:03.977447 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12cada45_6ba5_4db1_9a13_3de652b390bb.slice/crio-0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a WatchSource:0}: Error finding container 0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a: Status 404 returned error can't find the container with id 0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a
Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.507206 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"0e48c8f74b7202798c3d97547fdee7356db2adb79887d4eb2a54a8474d824011"}
Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.511983 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xtrqg" event={"ID":"d5485539-c722-477d-b595-649e07eac50e","Type":"ContainerStarted","Data":"a4591a3c1d7b1279937a98c799c83abe99429a53d01680d0592b50d1d359cd42"}
Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.516801 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" event={"ID":"12cada45-6ba5-4db1-9a13-3de652b390bb","Type":"ContainerStarted","Data":"0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a"}
Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.519886 4883 generic.go:334] "Generic (PLEG): container finished" podID="5d523ed0-183e-4bec-a110-fe622b69ef79" containerID="affbbf9bc93bb1cdc534fd16ed32d4696b867f4c70c0f6fa49bc5b18c4e55f72" exitCode=0
Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.519953 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cdc4t" event={"ID":"5d523ed0-183e-4bec-a110-fe622b69ef79","Type":"ContainerDied","Data":"affbbf9bc93bb1cdc534fd16ed32d4696b867f4c70c0f6fa49bc5b18c4e55f72"}
Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.520033 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cdc4t" event={"ID":"5d523ed0-183e-4bec-a110-fe622b69ef79","Type":"ContainerStarted","Data":"35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce"}
Mar 10 09:22:04 crc kubenswrapper[4883]: I0310 09:22:04.535735 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xtrqg" podStartSLOduration=4.637286189 podStartE2EDuration="16.535724759s" podCreationTimestamp="2026-03-10 09:21:48 +0000 UTC" firstStartedPulling="2026-03-10 09:21:51.167767727 +0000 UTC m=+1097.422665615" lastFinishedPulling="2026-03-10 09:22:03.066206297 +0000 UTC m=+1109.321104185" observedRunningTime="2026-03-10 09:22:04.530584075 +0000 UTC m=+1110.785481964" watchObservedRunningTime="2026-03-10 09:22:04.535724759 +0000 UTC m=+1110.790622648"
Mar 10 09:22:05 crc kubenswrapper[4883]: I0310 09:22:05.542896 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" event={"ID":"12cada45-6ba5-4db1-9a13-3de652b390bb","Type":"ContainerStarted","Data":"ed2fdfaa94f84ea6d60f3a225de89340f67d80aec7fa6c18b378eb49e482b9b8"}
Mar 10 09:22:05 crc kubenswrapper[4883]: I0310 09:22:05.564608 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" podStartSLOduration=4.555954675 podStartE2EDuration="5.564586019s" podCreationTimestamp="2026-03-10 09:22:00 +0000 UTC" firstStartedPulling="2026-03-10 09:22:03.980417242 +0000 UTC m=+1110.235315141" lastFinishedPulling="2026-03-10 09:22:04.989048596 +0000 UTC m=+1111.243946485" observedRunningTime="2026-03-10 09:22:05.560999166 +0000 UTC m=+1111.815897055" watchObservedRunningTime="2026-03-10 09:22:05.564586019 +0000 UTC m=+1111.819483908"
Mar 10 09:22:06 crc kubenswrapper[4883]: I0310 09:22:06.547771 4883 generic.go:334] "Generic (PLEG): container finished" podID="12cada45-6ba5-4db1-9a13-3de652b390bb" containerID="ed2fdfaa94f84ea6d60f3a225de89340f67d80aec7fa6c18b378eb49e482b9b8" exitCode=0
Mar 10 09:22:06 crc kubenswrapper[4883]: I0310 09:22:06.547834 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" event={"ID":"12cada45-6ba5-4db1-9a13-3de652b390bb","Type":"ContainerDied","Data":"ed2fdfaa94f84ea6d60f3a225de89340f67d80aec7fa6c18b378eb49e482b9b8"}
Mar 10 09:22:08 crc kubenswrapper[4883]: I0310 09:22:08.858351 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lb2z9" podUID="6691939e-adb0-420c-bf9e-f4a9b670c83b" containerName="ovn-controller" probeResult="failure" output=<
Mar 10 09:22:08 crc kubenswrapper[4883]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 10 09:22:08 crc kubenswrapper[4883]: >
Mar 10 09:22:08 crc kubenswrapper[4883]: I0310 09:22:08.858391 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qrl4s"
Mar 10 09:22:08 crc kubenswrapper[4883]: I0310 09:22:08.862379 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qrl4s"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.074372 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lb2z9-config-xlg86"]
Mar 10 09:22:09 crc kubenswrapper[4883]: E0310 09:22:09.075349 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe93226-96c7-4854-abdc-4afe54ad7ad5" containerName="swift-ring-rebalance"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.075380 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe93226-96c7-4854-abdc-4afe54ad7ad5" containerName="swift-ring-rebalance"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.075870 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe93226-96c7-4854-abdc-4afe54ad7ad5" containerName="swift-ring-rebalance"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.076778 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.083208 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.104526 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lb2z9-config-xlg86"]
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206051 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206126 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4jk7\" (UniqueName: \"kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206490 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206710 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206767 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.206998 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309643 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4jk7\" (UniqueName: \"kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309754 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309810 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309838 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309901 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.309965 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.310202 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.310221 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.310220 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.310799 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.312109 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.331632 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4jk7\" (UniqueName: \"kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7\") pod \"ovn-controller-lb2z9-config-xlg86\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.392250 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lb2z9-config-xlg86"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.824514 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.828679 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-kz9jr"
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.922875 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv225\" (UniqueName: \"kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225\") pod \"12cada45-6ba5-4db1-9a13-3de652b390bb\" (UID: \"12cada45-6ba5-4db1-9a13-3de652b390bb\") "
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.923253 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts\") pod \"5d523ed0-183e-4bec-a110-fe622b69ef79\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") "
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.923280 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbm4m\" (UniqueName: \"kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m\") pod \"5d523ed0-183e-4bec-a110-fe622b69ef79\" (UID: \"5d523ed0-183e-4bec-a110-fe622b69ef79\") "
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.926974 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d523ed0-183e-4bec-a110-fe622b69ef79" (UID: "5d523ed0-183e-4bec-a110-fe622b69ef79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.927075 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m" (OuterVolumeSpecName: "kube-api-access-jbm4m") pod "5d523ed0-183e-4bec-a110-fe622b69ef79" (UID: "5d523ed0-183e-4bec-a110-fe622b69ef79"). InnerVolumeSpecName "kube-api-access-jbm4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:22:09 crc kubenswrapper[4883]: I0310 09:22:09.933619 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225" (OuterVolumeSpecName: "kube-api-access-hv225") pod "12cada45-6ba5-4db1-9a13-3de652b390bb" (UID: "12cada45-6ba5-4db1-9a13-3de652b390bb"). InnerVolumeSpecName "kube-api-access-hv225". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.025508 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d523ed0-183e-4bec-a110-fe622b69ef79-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.025538 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbm4m\" (UniqueName: \"kubernetes.io/projected/5d523ed0-183e-4bec-a110-fe622b69ef79-kube-api-access-jbm4m\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.025552 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv225\" (UniqueName: \"kubernetes.io/projected/12cada45-6ba5-4db1-9a13-3de652b390bb-kube-api-access-hv225\") on node \"crc\" DevicePath \"\""
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.224503 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lb2z9-config-xlg86"]
Mar 10 09:22:10 crc kubenswrapper[4883]: W0310 09:22:10.239996 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0ae19cc_a5a6_40c2_81c1_c85d80abf698.slice/crio-97319dd87f2b38ac3b029fb472148ced2fb64f21995e1465c95aad375bf46b1e WatchSource:0}: Error finding container 97319dd87f2b38ac3b029fb472148ced2fb64f21995e1465c95aad375bf46b1e: Status 404 returned error can't find the container with id 97319dd87f2b38ac3b029fb472148ced2fb64f21995e1465c95aad375bf46b1e
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.580652 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lb2z9-config-xlg86" event={"ID":"e0ae19cc-a5a6-40c2-81c1-c85d80abf698","Type":"ContainerStarted","Data":"969226cfd74c44f03958486fd0daa3fd76ae699fc5b9689135438015e7c4fbc5"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.581162 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lb2z9-config-xlg86" event={"ID":"e0ae19cc-a5a6-40c2-81c1-c85d80abf698","Type":"ContainerStarted","Data":"97319dd87f2b38ac3b029fb472148ced2fb64f21995e1465c95aad375bf46b1e"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.583590 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cdc4t"
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.583567 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cdc4t" event={"ID":"5d523ed0-183e-4bec-a110-fe622b69ef79","Type":"ContainerDied","Data":"35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.583716 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35575a977905ac6263222ca887674a296d87a0c48aab15cc70d1a61b9ed7b6ce"
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.591751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"cde66402e8a113a08f3939caaddb582cd76af576918a9313978e4264e146e64c"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.591783 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"5e5101b2c9fe4b924daf78660657a6f7ff64d89fc1dc8ea69534d49f205a3790"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.591794 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"4301134820ef18f16447d81d9bcd70b68e4721bd974daacf289ff2cae961b4c2"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.591804 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"60f26fdfcd1fd4afc796a7cd4e553b7137464997d77a8e0c525b10010919d749"}
Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.593723 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552242-kz9jr"
event={"ID":"12cada45-6ba5-4db1-9a13-3de652b390bb","Type":"ContainerDied","Data":"0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a"} Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.593747 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0851aca758d59994ecfe5174aff7960510a4cfe47e7e41268f80ccf57084995a" Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.593935 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-kz9jr" Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.599827 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lb2z9-config-xlg86" podStartSLOduration=1.599809065 podStartE2EDuration="1.599809065s" podCreationTimestamp="2026-03-10 09:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:10.597072596 +0000 UTC m=+1116.851970484" watchObservedRunningTime="2026-03-10 09:22:10.599809065 +0000 UTC m=+1116.854706955" Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.905021 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-hdd6d"] Mar 10 09:22:10 crc kubenswrapper[4883]: I0310 09:22:10.912624 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-hdd6d"] Mar 10 09:22:11 crc kubenswrapper[4883]: I0310 09:22:11.605032 4883 generic.go:334] "Generic (PLEG): container finished" podID="e0ae19cc-a5a6-40c2-81c1-c85d80abf698" containerID="969226cfd74c44f03958486fd0daa3fd76ae699fc5b9689135438015e7c4fbc5" exitCode=0 Mar 10 09:22:11 crc kubenswrapper[4883]: I0310 09:22:11.605162 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lb2z9-config-xlg86" 
event={"ID":"e0ae19cc-a5a6-40c2-81c1-c85d80abf698","Type":"ContainerDied","Data":"969226cfd74c44f03958486fd0daa3fd76ae699fc5b9689135438015e7c4fbc5"} Mar 10 09:22:11 crc kubenswrapper[4883]: I0310 09:22:11.607955 4883 generic.go:334] "Generic (PLEG): container finished" podID="d5485539-c722-477d-b595-649e07eac50e" containerID="a4591a3c1d7b1279937a98c799c83abe99429a53d01680d0592b50d1d359cd42" exitCode=0 Mar 10 09:22:11 crc kubenswrapper[4883]: I0310 09:22:11.607993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xtrqg" event={"ID":"d5485539-c722-477d-b595-649e07eac50e","Type":"ContainerDied","Data":"a4591a3c1d7b1279937a98c799c83abe99429a53d01680d0592b50d1d359cd42"} Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.107971 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2a502d2-d219-4f01-aebc-f27fb7766458" path="/var/lib/kubelet/pods/a2a502d2-d219-4f01-aebc-f27fb7766458/volumes" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.620288 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"20a3c9c9b07e0b3bbf86ab05587abec2ae8fb8c885231972c34fd3870d545603"} Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.620673 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"f840485d0b5166b267e837f8a420a390e5e2179416cc947e579e7af464d5d9fd"} Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.620693 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"6d07d8124acd370018c4e4fc31f9f249ffaab52b09d4c2e03bac9d7bae73169c"} Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.620703 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"5ca3f23453da824e8b3a92885593c23039499f92e18020946ff11d3ad0255a92"} Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.887442 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.922190 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xtrqg" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989132 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989258 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989253 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run" (OuterVolumeSpecName: "var-run") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989294 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4jk7\" (UniqueName: \"kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989371 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989410 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989484 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.989503 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts\") pod \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\" (UID: \"e0ae19cc-a5a6-40c2-81c1-c85d80abf698\") " Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990009 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990214 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts" (OuterVolumeSpecName: "scripts") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990305 4883 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990331 4883 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990344 4883 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990351 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.990454 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:22:12 crc kubenswrapper[4883]: I0310 09:22:12.993837 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7" (OuterVolumeSpecName: "kube-api-access-c4jk7") pod "e0ae19cc-a5a6-40c2-81c1-c85d80abf698" (UID: "e0ae19cc-a5a6-40c2-81c1-c85d80abf698"). InnerVolumeSpecName "kube-api-access-c4jk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.090993 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9qnd\" (UniqueName: \"kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd\") pod \"d5485539-c722-477d-b595-649e07eac50e\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.091202 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle\") pod \"d5485539-c722-477d-b595-649e07eac50e\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.091297 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data\") pod \"d5485539-c722-477d-b595-649e07eac50e\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.091335 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data\") pod \"d5485539-c722-477d-b595-649e07eac50e\" (UID: \"d5485539-c722-477d-b595-649e07eac50e\") " Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.091815 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4jk7\" (UniqueName: \"kubernetes.io/projected/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-kube-api-access-c4jk7\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.091839 4883 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e0ae19cc-a5a6-40c2-81c1-c85d80abf698-var-log-ovn\") 
on node \"crc\" DevicePath \"\"" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.094652 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d5485539-c722-477d-b595-649e07eac50e" (UID: "d5485539-c722-477d-b595-649e07eac50e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.096415 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd" (OuterVolumeSpecName: "kube-api-access-s9qnd") pod "d5485539-c722-477d-b595-649e07eac50e" (UID: "d5485539-c722-477d-b595-649e07eac50e"). InnerVolumeSpecName "kube-api-access-s9qnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.111372 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5485539-c722-477d-b595-649e07eac50e" (UID: "d5485539-c722-477d-b595-649e07eac50e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.123613 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data" (OuterVolumeSpecName: "config-data") pod "d5485539-c722-477d-b595-649e07eac50e" (UID: "d5485539-c722-477d-b595-649e07eac50e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.194375 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.194781 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.194799 4883 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5485539-c722-477d-b595-649e07eac50e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.194813 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9qnd\" (UniqueName: \"kubernetes.io/projected/d5485539-c722-477d-b595-649e07eac50e-kube-api-access-s9qnd\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.327017 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lb2z9-config-xlg86"] Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.340703 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lb2z9-config-xlg86"] Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.632344 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97319dd87f2b38ac3b029fb472148ced2fb64f21995e1465c95aad375bf46b1e" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.632449 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lb2z9-config-xlg86" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.635051 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xtrqg" event={"ID":"d5485539-c722-477d-b595-649e07eac50e","Type":"ContainerDied","Data":"2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7"} Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.635125 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cb0092f81baba80b5e32ffbad7cc2ef9d1d4d56b87128258545a452766054f7" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.635151 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xtrqg" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.921988 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lb2z9" Mar 10 09:22:13 crc kubenswrapper[4883]: I0310 09:22:13.999232 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"] Mar 10 09:22:14 crc kubenswrapper[4883]: E0310 09:22:14.009917 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12cada45-6ba5-4db1-9a13-3de652b390bb" containerName="oc" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.010458 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cada45-6ba5-4db1-9a13-3de652b390bb" containerName="oc" Mar 10 09:22:14 crc kubenswrapper[4883]: E0310 09:22:14.010584 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5485539-c722-477d-b595-649e07eac50e" containerName="glance-db-sync" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.010646 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5485539-c722-477d-b595-649e07eac50e" containerName="glance-db-sync" Mar 10 09:22:14 crc kubenswrapper[4883]: E0310 09:22:14.010718 4883 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e0ae19cc-a5a6-40c2-81c1-c85d80abf698" containerName="ovn-config" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.010766 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ae19cc-a5a6-40c2-81c1-c85d80abf698" containerName="ovn-config" Mar 10 09:22:14 crc kubenswrapper[4883]: E0310 09:22:14.010821 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d523ed0-183e-4bec-a110-fe622b69ef79" containerName="mariadb-account-create-update" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.010875 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d523ed0-183e-4bec-a110-fe622b69ef79" containerName="mariadb-account-create-update" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.011137 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5485539-c722-477d-b595-649e07eac50e" containerName="glance-db-sync" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.011196 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ae19cc-a5a6-40c2-81c1-c85d80abf698" containerName="ovn-config" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.011250 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d523ed0-183e-4bec-a110-fe622b69ef79" containerName="mariadb-account-create-update" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.011325 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="12cada45-6ba5-4db1-9a13-3de652b390bb" containerName="oc" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.012301 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.019137 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"] Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.102778 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ae19cc-a5a6-40c2-81c1-c85d80abf698" path="/var/lib/kubelet/pods/e0ae19cc-a5a6-40c2-81c1-c85d80abf698/volumes" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.111200 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.111412 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5d6b\" (UniqueName: \"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.111517 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.111664 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb\") pod 
\"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.111791 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.214659 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.214843 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.214891 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5d6b\" (UniqueName: \"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.214944 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: 
\"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.214980 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.215690 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.215765 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.216160 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.216919 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc 
kubenswrapper[4883]: I0310 09:22:14.233347 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5d6b\" (UniqueName: \"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b\") pod \"dnsmasq-dns-7f58d6bb6f-l9wzv\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.328938 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.659330 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"f7334b9c5072a37af68e44df13c967bdc36c3bc3e3ad13a8af7e3548b95adf0e"} Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.659709 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"f163ae110b1cda0c4cf72333733a9a16fbb6fc3fcf03143dfd294ff6b6f85791"} Mar 10 09:22:14 crc kubenswrapper[4883]: I0310 09:22:14.757084 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"] Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.673009 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"dddd2526d3626ca272ce3dde931298a46bd49bc7885f7181adfdfff8ef818b33"} Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.673392 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"0c0337430709f67cbf74510fb4fc66182c5a0926ade86b0871b6a9763923b8ef"} Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.673406 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"b4337a1a9bdcb12179e815ec153ed87760dff0dd587c319ff7e107ff0c109245"} Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.673415 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"2254f6d1873b619efb991750d1877e62722be76753968c3e9c37772f04278339"} Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.673424 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39fdf41f-a914-4d0f-8d0c-5e378567a2db","Type":"ContainerStarted","Data":"ea5fd364998bd0d772f65137af11f496dc0e207c51e0d266e7a6fbb60f23707c"} Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.684319 4883 generic.go:334] "Generic (PLEG): container finished" podID="949789c9-e015-4172-90f8-9a97607f3cd0" containerID="bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48" exitCode=0 Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.684371 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" event={"ID":"949789c9-e015-4172-90f8-9a97607f3cd0","Type":"ContainerDied","Data":"bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48"} Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.684400 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" event={"ID":"949789c9-e015-4172-90f8-9a97607f3cd0","Type":"ContainerStarted","Data":"d6379cc5f0f0d1e68546e9664112db8e5584a977dbf3b98f9500eb8f5898a6cd"} Mar 10 09:22:15 crc kubenswrapper[4883]: I0310 09:22:15.714607 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.000563198 podStartE2EDuration="30.714589051s" podCreationTimestamp="2026-03-10 09:21:45 
+0000 UTC" firstStartedPulling="2026-03-10 09:22:03.591952289 +0000 UTC m=+1109.846850178" lastFinishedPulling="2026-03-10 09:22:14.305978142 +0000 UTC m=+1120.560876031" observedRunningTime="2026-03-10 09:22:15.706317805 +0000 UTC m=+1121.961215694" watchObservedRunningTime="2026-03-10 09:22:15.714589051 +0000 UTC m=+1121.969486941" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.045293 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"] Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.069169 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"] Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.070802 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.072447 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.090435 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"] Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.155448 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.155771 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpsnd\" (UniqueName: \"kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 
09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.155862 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.156161 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.156225 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.156255 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258336 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " 
pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258395 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258421 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258453 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258518 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpsnd\" (UniqueName: \"kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.258551 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 
crc kubenswrapper[4883]: I0310 09:22:16.259640 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.259892 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.260004 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.260129 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.260375 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.276892 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-dpsnd\" (UniqueName: \"kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd\") pod \"dnsmasq-dns-75c886f8b5-scjqt\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.385368 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.693163 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" event={"ID":"949789c9-e015-4172-90f8-9a97607f3cd0","Type":"ContainerStarted","Data":"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225"} Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.693523 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.709863 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" podStartSLOduration=3.70984843 podStartE2EDuration="3.70984843s" podCreationTimestamp="2026-03-10 09:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:16.706515404 +0000 UTC m=+1122.961413294" watchObservedRunningTime="2026-03-10 09:22:16.70984843 +0000 UTC m=+1122.964746319" Mar 10 09:22:16 crc kubenswrapper[4883]: I0310 09:22:16.799037 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"] Mar 10 09:22:16 crc kubenswrapper[4883]: W0310 09:22:16.801221 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod631a0fc2_de6d_4778_bce2_46b69c306e44.slice/crio-5064013cb3a4c978ab4c5b11395e9bcf9761e80e64e6533a75fe3d587b5a3d7c WatchSource:0}: 
Error finding container 5064013cb3a4c978ab4c5b11395e9bcf9761e80e64e6533a75fe3d587b5a3d7c: Status 404 returned error can't find the container with id 5064013cb3a4c978ab4c5b11395e9bcf9761e80e64e6533a75fe3d587b5a3d7c Mar 10 09:22:17 crc kubenswrapper[4883]: I0310 09:22:17.704180 4883 generic.go:334] "Generic (PLEG): container finished" podID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerID="e2b7245fe34a086ba653b170ba754d14297a7a81b375fe409da9fc2787c69d3f" exitCode=0 Mar 10 09:22:17 crc kubenswrapper[4883]: I0310 09:22:17.704233 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" event={"ID":"631a0fc2-de6d-4778-bce2-46b69c306e44","Type":"ContainerDied","Data":"e2b7245fe34a086ba653b170ba754d14297a7a81b375fe409da9fc2787c69d3f"} Mar 10 09:22:17 crc kubenswrapper[4883]: I0310 09:22:17.704665 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" event={"ID":"631a0fc2-de6d-4778-bce2-46b69c306e44","Type":"ContainerStarted","Data":"5064013cb3a4c978ab4c5b11395e9bcf9761e80e64e6533a75fe3d587b5a3d7c"} Mar 10 09:22:17 crc kubenswrapper[4883]: I0310 09:22:17.704745 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="dnsmasq-dns" containerID="cri-o://09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225" gracePeriod=10 Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.049649 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.203295 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb\") pod \"949789c9-e015-4172-90f8-9a97607f3cd0\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.203360 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config\") pod \"949789c9-e015-4172-90f8-9a97607f3cd0\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.203512 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5d6b\" (UniqueName: \"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b\") pod \"949789c9-e015-4172-90f8-9a97607f3cd0\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.203661 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb\") pod \"949789c9-e015-4172-90f8-9a97607f3cd0\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.203747 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc\") pod \"949789c9-e015-4172-90f8-9a97607f3cd0\" (UID: \"949789c9-e015-4172-90f8-9a97607f3cd0\") " Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.208550 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b" (OuterVolumeSpecName: "kube-api-access-x5d6b") pod "949789c9-e015-4172-90f8-9a97607f3cd0" (UID: "949789c9-e015-4172-90f8-9a97607f3cd0"). InnerVolumeSpecName "kube-api-access-x5d6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.231155 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "949789c9-e015-4172-90f8-9a97607f3cd0" (UID: "949789c9-e015-4172-90f8-9a97607f3cd0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.234329 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "949789c9-e015-4172-90f8-9a97607f3cd0" (UID: "949789c9-e015-4172-90f8-9a97607f3cd0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.235745 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "949789c9-e015-4172-90f8-9a97607f3cd0" (UID: "949789c9-e015-4172-90f8-9a97607f3cd0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.236400 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config" (OuterVolumeSpecName: "config") pod "949789c9-e015-4172-90f8-9a97607f3cd0" (UID: "949789c9-e015-4172-90f8-9a97607f3cd0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.307846 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.307876 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.307891 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.307903 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5d6b\" (UniqueName: \"kubernetes.io/projected/949789c9-e015-4172-90f8-9a97607f3cd0-kube-api-access-x5d6b\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.307913 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949789c9-e015-4172-90f8-9a97607f3cd0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.715626 4883 generic.go:334] "Generic (PLEG): container finished" podID="949789c9-e015-4172-90f8-9a97607f3cd0" containerID="09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225" exitCode=0 Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.715701 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.715729 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" event={"ID":"949789c9-e015-4172-90f8-9a97607f3cd0","Type":"ContainerDied","Data":"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225"} Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.715795 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-l9wzv" event={"ID":"949789c9-e015-4172-90f8-9a97607f3cd0","Type":"ContainerDied","Data":"d6379cc5f0f0d1e68546e9664112db8e5584a977dbf3b98f9500eb8f5898a6cd"} Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.715825 4883 scope.go:117] "RemoveContainer" containerID="09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.718070 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" event={"ID":"631a0fc2-de6d-4778-bce2-46b69c306e44","Type":"ContainerStarted","Data":"751aa4bf679ccd2fb9310821003d8071e637bbcc3be2f295875a2abfad364ce2"} Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.718236 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.733604 4883 scope.go:117] "RemoveContainer" containerID="bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.756945 4883 scope.go:117] "RemoveContainer" containerID="09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225" Mar 10 09:22:18 crc kubenswrapper[4883]: E0310 09:22:18.757360 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225\": container with ID 
starting with 09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225 not found: ID does not exist" containerID="09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.757400 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225"} err="failed to get container status \"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225\": rpc error: code = NotFound desc = could not find container \"09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225\": container with ID starting with 09714cc863133b2c63ec7f8e8ba982693302e05d49a4c1dfc82035fd99ae2225 not found: ID does not exist" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.757426 4883 scope.go:117] "RemoveContainer" containerID="bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48" Mar 10 09:22:18 crc kubenswrapper[4883]: E0310 09:22:18.758092 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48\": container with ID starting with bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48 not found: ID does not exist" containerID="bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.758131 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48"} err="failed to get container status \"bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48\": rpc error: code = NotFound desc = could not find container \"bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48\": container with ID starting with bb4581878c323c5da4c41c90a830b209c80bab5bf1de15377081ce5b28300a48 not found: 
ID does not exist" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.769945 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" podStartSLOduration=2.769931593 podStartE2EDuration="2.769931593s" podCreationTimestamp="2026-03-10 09:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:18.766097163 +0000 UTC m=+1125.020995052" watchObservedRunningTime="2026-03-10 09:22:18.769931593 +0000 UTC m=+1125.024829482" Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.784077 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"] Mar 10 09:22:18 crc kubenswrapper[4883]: I0310 09:22:18.791414 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-l9wzv"] Mar 10 09:22:19 crc kubenswrapper[4883]: I0310 09:22:19.700712 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:22:19 crc kubenswrapper[4883]: I0310 09:22:19.975777 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 09:22:20 crc kubenswrapper[4883]: I0310 09:22:20.087738 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" path="/var/lib/kubelet/pods/949789c9-e015-4172-90f8-9a97607f3cd0/volumes" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.193745 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8664j"] Mar 10 09:22:21 crc kubenswrapper[4883]: E0310 09:22:21.194329 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="dnsmasq-dns" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.194343 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="dnsmasq-dns" Mar 10 09:22:21 crc kubenswrapper[4883]: E0310 09:22:21.194352 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="init" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.194358 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="init" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.194559 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="949789c9-e015-4172-90f8-9a97607f3cd0" containerName="dnsmasq-dns" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.195106 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.211720 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8664j"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.300152 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wzxkv"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.301370 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.322565 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-27c8-account-create-update-9w2q5"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.323909 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.327752 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.337276 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wzxkv"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.341980 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-27c8-account-create-update-9w2q5"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.379824 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.380040 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkbcf\" (UniqueName: \"kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.481535 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k94lm\" (UniqueName: \"kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm\") pod \"barbican-db-create-wzxkv\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.481662 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7qwk\" 
(UniqueName: \"kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.481697 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts\") pod \"barbican-db-create-wzxkv\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.481773 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.482532 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.482733 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkbcf\" (UniqueName: \"kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.483173 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.494115 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ef9a-account-create-update-4bwrd"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.495349 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.496962 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.514701 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hrq22"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.515879 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.522000 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ef9a-account-create-update-4bwrd"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.527208 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hrq22"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.529711 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkbcf\" (UniqueName: \"kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf\") pod \"cinder-db-create-8664j\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.565246 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-w5q6s"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.566461 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.571294 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.571772 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.571798 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.571787 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92kmh" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.571911 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-w5q6s"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.584878 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k94lm\" (UniqueName: \"kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm\") pod \"barbican-db-create-wzxkv\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.584932 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7qwk\" (UniqueName: \"kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.584963 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts\") pod \"barbican-db-create-wzxkv\" (UID: 
\"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.585145 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.585930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.586039 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts\") pod \"barbican-db-create-wzxkv\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.603181 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-fca8-account-create-update-7jkwx"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.604368 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.605861 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.610894 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-fca8-account-create-update-7jkwx"] Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.622387 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7qwk\" (UniqueName: \"kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk\") pod \"cinder-27c8-account-create-update-9w2q5\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.625117 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k94lm\" (UniqueName: \"kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm\") pod \"barbican-db-create-wzxkv\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.635126 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.688325 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5vkl\" (UniqueName: \"kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.688534 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.688788 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.688891 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.688947 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfmvc\" (UniqueName: \"kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc\") pod 
\"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.689066 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts\") pod \"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.689099 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt7rx\" (UniqueName: \"kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.790876 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.790955 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.790990 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791072 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfmvc\" (UniqueName: \"kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc\") pod \"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791760 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6tlm\" (UniqueName: \"kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791806 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts\") pod \"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791827 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt7rx\" (UniqueName: \"kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791873 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s5vkl\" (UniqueName: \"kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.791900 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.792538 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.792794 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts\") pod \"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.796613 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.800656 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.807469 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt7rx\" (UniqueName: \"kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx\") pod \"neutron-db-create-hrq22\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.807966 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfmvc\" (UniqueName: \"kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc\") pod \"neutron-ef9a-account-create-update-4bwrd\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.811551 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.812432 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5vkl\" (UniqueName: \"kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl\") pod \"keystone-db-sync-w5q6s\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.824863 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8664j" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.858141 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.893741 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.898689 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6tlm\" (UniqueName: \"kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.898937 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.899635 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.924250 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:21 crc kubenswrapper[4883]: I0310 09:22:21.952286 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6tlm\" (UniqueName: \"kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm\") pod \"barbican-fca8-account-create-update-7jkwx\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.030247 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.147593 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-27c8-account-create-update-9w2q5"] Mar 10 09:22:22 crc kubenswrapper[4883]: W0310 09:22:22.176458 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc357ec28_9cec_42e8_9e4d_dc1fb9960bc7.slice/crio-3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00 WatchSource:0}: Error finding container 3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00: Status 404 returned error can't find the container with id 3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00 Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.196064 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ef9a-account-create-update-4bwrd"] Mar 10 09:22:22 crc kubenswrapper[4883]: W0310 09:22:22.217244 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded176738_d518_45e3_be47_3ace090d0e7a.slice/crio-58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272 WatchSource:0}: Error finding container 58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272: Status 
404 returned error can't find the container with id 58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272 Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.539306 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8664j"] Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.604838 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hrq22"] Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.651996 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-w5q6s"] Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.671048 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wzxkv"] Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.684242 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-fca8-account-create-update-7jkwx"] Mar 10 09:22:22 crc kubenswrapper[4883]: W0310 09:22:22.699766 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94df275b_e089_4e1f_8eac_e4806d2f1178.slice/crio-3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53 WatchSource:0}: Error finding container 3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53: Status 404 returned error can't find the container with id 3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53 Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.761105 4883 generic.go:334] "Generic (PLEG): container finished" podID="ed176738-d518-45e3-be47-3ace090d0e7a" containerID="c7587acba5dab37b49dbdd81924e01184e73978fd599f62b1af6671e7ae50b6e" exitCode=0 Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.761186 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ef9a-account-create-update-4bwrd" 
event={"ID":"ed176738-d518-45e3-be47-3ace090d0e7a","Type":"ContainerDied","Data":"c7587acba5dab37b49dbdd81924e01184e73978fd599f62b1af6671e7ae50b6e"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.761220 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ef9a-account-create-update-4bwrd" event={"ID":"ed176738-d518-45e3-be47-3ace090d0e7a","Type":"ContainerStarted","Data":"58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.762862 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fca8-account-create-update-7jkwx" event={"ID":"07a8b78f-e864-49d5-9dfb-aebd86741885","Type":"ContainerStarted","Data":"4564e46ddd506f32a201d9f1057e104c34fe9ab13966b5aa8a79230af780d4c0"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.765293 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5q6s" event={"ID":"96942836-243a-48c5-be3d-5eb5e5f166d0","Type":"ContainerStarted","Data":"be6c6a8d52e4e43dcf7deb7dc6d88e3b0cf1816a00b3324c09c224772a4f6fa7"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.767194 4883 generic.go:334] "Generic (PLEG): container finished" podID="c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" containerID="a48c527e869a78aa5301ce2ab9632963d3e2d800250d247df83963b7da9be724" exitCode=0 Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.767271 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-27c8-account-create-update-9w2q5" event={"ID":"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7","Type":"ContainerDied","Data":"a48c527e869a78aa5301ce2ab9632963d3e2d800250d247df83963b7da9be724"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.767307 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-27c8-account-create-update-9w2q5" 
event={"ID":"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7","Type":"ContainerStarted","Data":"3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.768611 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wzxkv" event={"ID":"94df275b-e089-4e1f-8eac-e4806d2f1178","Type":"ContainerStarted","Data":"3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.780417 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hrq22" event={"ID":"24713bd6-5868-43ec-94ec-2371a49a0b88","Type":"ContainerStarted","Data":"dbf762b1ecb64eb779af79450c42147ea6c3038641d43eb500690891663077b3"} Mar 10 09:22:22 crc kubenswrapper[4883]: I0310 09:22:22.782031 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8664j" event={"ID":"ed7aa202-c734-4333-a1de-1bdb39d59804","Type":"ContainerStarted","Data":"9b5b8b914e9f13ae13ab5c11566677509e8d1e77ce6e654bf5a93d7fab30b16c"} Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.791813 4883 generic.go:334] "Generic (PLEG): container finished" podID="94df275b-e089-4e1f-8eac-e4806d2f1178" containerID="98460e32a504c0e3ede8a9fd544c2c34e4954a1cfed507bb532c53cf560762fd" exitCode=0 Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.791908 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wzxkv" event={"ID":"94df275b-e089-4e1f-8eac-e4806d2f1178","Type":"ContainerDied","Data":"98460e32a504c0e3ede8a9fd544c2c34e4954a1cfed507bb532c53cf560762fd"} Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.794420 4883 generic.go:334] "Generic (PLEG): container finished" podID="24713bd6-5868-43ec-94ec-2371a49a0b88" containerID="5dba08c9d93be005c0c85060006f6110a86c429508b6e36e94151d58e533d961" exitCode=0 Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.794499 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-db-create-hrq22" event={"ID":"24713bd6-5868-43ec-94ec-2371a49a0b88","Type":"ContainerDied","Data":"5dba08c9d93be005c0c85060006f6110a86c429508b6e36e94151d58e533d961"} Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.795998 4883 generic.go:334] "Generic (PLEG): container finished" podID="ed7aa202-c734-4333-a1de-1bdb39d59804" containerID="8b81faa071a739cf8a7f25085f6d2124f3dcc3e17601b69b578e8e6f428069ce" exitCode=0 Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.796073 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8664j" event={"ID":"ed7aa202-c734-4333-a1de-1bdb39d59804","Type":"ContainerDied","Data":"8b81faa071a739cf8a7f25085f6d2124f3dcc3e17601b69b578e8e6f428069ce"} Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.797502 4883 generic.go:334] "Generic (PLEG): container finished" podID="07a8b78f-e864-49d5-9dfb-aebd86741885" containerID="9f05adebe53489f83df9e03cf5da9583790650f545f7218c9e2d571583c52501" exitCode=0 Mar 10 09:22:23 crc kubenswrapper[4883]: I0310 09:22:23.797739 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fca8-account-create-update-7jkwx" event={"ID":"07a8b78f-e864-49d5-9dfb-aebd86741885","Type":"ContainerDied","Data":"9f05adebe53489f83df9e03cf5da9583790650f545f7218c9e2d571583c52501"} Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.177050 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.181943 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.350484 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts\") pod \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.350524 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7qwk\" (UniqueName: \"kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk\") pod \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\" (UID: \"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7\") " Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.350554 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfmvc\" (UniqueName: \"kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc\") pod \"ed176738-d518-45e3-be47-3ace090d0e7a\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.350581 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts\") pod \"ed176738-d518-45e3-be47-3ace090d0e7a\" (UID: \"ed176738-d518-45e3-be47-3ace090d0e7a\") " Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.351763 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed176738-d518-45e3-be47-3ace090d0e7a" (UID: "ed176738-d518-45e3-be47-3ace090d0e7a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.356549 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" (UID: "c357ec28-9cec-42e8-9e4d-dc1fb9960bc7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.367654 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc" (OuterVolumeSpecName: "kube-api-access-lfmvc") pod "ed176738-d518-45e3-be47-3ace090d0e7a" (UID: "ed176738-d518-45e3-be47-3ace090d0e7a"). InnerVolumeSpecName "kube-api-access-lfmvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.376298 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk" (OuterVolumeSpecName: "kube-api-access-x7qwk") pod "c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" (UID: "c357ec28-9cec-42e8-9e4d-dc1fb9960bc7"). InnerVolumeSpecName "kube-api-access-x7qwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.454190 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.454224 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7qwk\" (UniqueName: \"kubernetes.io/projected/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7-kube-api-access-x7qwk\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.454237 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfmvc\" (UniqueName: \"kubernetes.io/projected/ed176738-d518-45e3-be47-3ace090d0e7a-kube-api-access-lfmvc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.454248 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed176738-d518-45e3-be47-3ace090d0e7a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.812760 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ef9a-account-create-update-4bwrd" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.812763 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ef9a-account-create-update-4bwrd" event={"ID":"ed176738-d518-45e3-be47-3ace090d0e7a","Type":"ContainerDied","Data":"58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272"} Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.813171 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58b015606f21414ba323487161bd4b998e740b02422f73f204ccd263aafe0272" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.827133 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-27c8-account-create-update-9w2q5" Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.827549 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-27c8-account-create-update-9w2q5" event={"ID":"c357ec28-9cec-42e8-9e4d-dc1fb9960bc7","Type":"ContainerDied","Data":"3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00"} Mar 10 09:22:24 crc kubenswrapper[4883]: I0310 09:22:24.827603 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3539e3cd3e3904f08f633f29769d6e5b99ec33a5b0a82fae13aeeab5f98d7f00" Mar 10 09:22:26 crc kubenswrapper[4883]: I0310 09:22:26.387339 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:26 crc kubenswrapper[4883]: I0310 09:22:26.453875 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"] Mar 10 09:22:26 crc kubenswrapper[4883]: I0310 09:22:26.457259 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="dnsmasq-dns" 
containerID="cri-o://ab3d26f0cc1f7b7d4aa73624b5d0a56d07f024661893458f62e11a314a4cd5c5" gracePeriod=10 Mar 10 09:22:26 crc kubenswrapper[4883]: I0310 09:22:26.850148 4883 generic.go:334] "Generic (PLEG): container finished" podID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerID="ab3d26f0cc1f7b7d4aa73624b5d0a56d07f024661893458f62e11a314a4cd5c5" exitCode=0 Mar 10 09:22:26 crc kubenswrapper[4883]: I0310 09:22:26.850200 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" event={"ID":"7b8b44ba-0853-4287-a0ba-ef1607c66d7b","Type":"ContainerDied","Data":"ab3d26f0cc1f7b7d4aa73624b5d0a56d07f024661893458f62e11a314a4cd5c5"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.282396 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.322000 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.344564 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.356259 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8664j" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410589 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k94lm\" (UniqueName: \"kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm\") pod \"94df275b-e089-4e1f-8eac-e4806d2f1178\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410637 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkbcf\" (UniqueName: \"kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf\") pod \"ed7aa202-c734-4333-a1de-1bdb39d59804\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410692 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts\") pod \"94df275b-e089-4e1f-8eac-e4806d2f1178\" (UID: \"94df275b-e089-4e1f-8eac-e4806d2f1178\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410717 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt7rx\" (UniqueName: \"kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx\") pod \"24713bd6-5868-43ec-94ec-2371a49a0b88\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410799 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6tlm\" (UniqueName: \"kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm\") pod \"07a8b78f-e864-49d5-9dfb-aebd86741885\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410841 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts\") pod \"24713bd6-5868-43ec-94ec-2371a49a0b88\" (UID: \"24713bd6-5868-43ec-94ec-2371a49a0b88\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410923 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts\") pod \"ed7aa202-c734-4333-a1de-1bdb39d59804\" (UID: \"ed7aa202-c734-4333-a1de-1bdb39d59804\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.410944 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts\") pod \"07a8b78f-e864-49d5-9dfb-aebd86741885\" (UID: \"07a8b78f-e864-49d5-9dfb-aebd86741885\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.411911 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "07a8b78f-e864-49d5-9dfb-aebd86741885" (UID: "07a8b78f-e864-49d5-9dfb-aebd86741885"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.412457 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed7aa202-c734-4333-a1de-1bdb39d59804" (UID: "ed7aa202-c734-4333-a1de-1bdb39d59804"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.412937 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24713bd6-5868-43ec-94ec-2371a49a0b88" (UID: "24713bd6-5868-43ec-94ec-2371a49a0b88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.413670 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94df275b-e089-4e1f-8eac-e4806d2f1178" (UID: "94df275b-e089-4e1f-8eac-e4806d2f1178"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.415911 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx" (OuterVolumeSpecName: "kube-api-access-lt7rx") pod "24713bd6-5868-43ec-94ec-2371a49a0b88" (UID: "24713bd6-5868-43ec-94ec-2371a49a0b88"). InnerVolumeSpecName "kube-api-access-lt7rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.415994 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm" (OuterVolumeSpecName: "kube-api-access-v6tlm") pod "07a8b78f-e864-49d5-9dfb-aebd86741885" (UID: "07a8b78f-e864-49d5-9dfb-aebd86741885"). InnerVolumeSpecName "kube-api-access-v6tlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.417679 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf" (OuterVolumeSpecName: "kube-api-access-kkbcf") pod "ed7aa202-c734-4333-a1de-1bdb39d59804" (UID: "ed7aa202-c734-4333-a1de-1bdb39d59804"). InnerVolumeSpecName "kube-api-access-kkbcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.417951 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm" (OuterVolumeSpecName: "kube-api-access-k94lm") pod "94df275b-e089-4e1f-8eac-e4806d2f1178" (UID: "94df275b-e089-4e1f-8eac-e4806d2f1178"). InnerVolumeSpecName "kube-api-access-k94lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.475138 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524510 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6tlm\" (UniqueName: \"kubernetes.io/projected/07a8b78f-e864-49d5-9dfb-aebd86741885-kube-api-access-v6tlm\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524551 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24713bd6-5868-43ec-94ec-2371a49a0b88-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524562 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7aa202-c734-4333-a1de-1bdb39d59804-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524684 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07a8b78f-e864-49d5-9dfb-aebd86741885-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524702 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k94lm\" (UniqueName: \"kubernetes.io/projected/94df275b-e089-4e1f-8eac-e4806d2f1178-kube-api-access-k94lm\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524713 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkbcf\" (UniqueName: \"kubernetes.io/projected/ed7aa202-c734-4333-a1de-1bdb39d59804-kube-api-access-kkbcf\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524724 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94df275b-e089-4e1f-8eac-e4806d2f1178-operator-scripts\") on node \"crc\" DevicePath 
\"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.524734 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt7rx\" (UniqueName: \"kubernetes.io/projected/24713bd6-5868-43ec-94ec-2371a49a0b88-kube-api-access-lt7rx\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.626012 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.626103 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.626279 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.626392 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27qgz\" (UniqueName: \"kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.626436 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: 
\"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.630168 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz" (OuterVolumeSpecName: "kube-api-access-27qgz") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b"). InnerVolumeSpecName "kube-api-access-27qgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.660988 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.662247 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: E0310 09:22:27.666766 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config podName:7b8b44ba-0853-4287-a0ba-ef1607c66d7b nodeName:}" failed. No retries permitted until 2026-03-10 09:22:28.166731236 +0000 UTC m=+1134.421629135 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b") : error deleting /var/lib/kubelet/pods/7b8b44ba-0853-4287-a0ba-ef1607c66d7b/volume-subpaths: remove /var/lib/kubelet/pods/7b8b44ba-0853-4287-a0ba-ef1607c66d7b/volume-subpaths: no such file or directory Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.667047 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.729919 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.729950 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.729965 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27qgz\" (UniqueName: \"kubernetes.io/projected/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-kube-api-access-27qgz\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.729975 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 
09:22:27.861896 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hrq22" event={"ID":"24713bd6-5868-43ec-94ec-2371a49a0b88","Type":"ContainerDied","Data":"dbf762b1ecb64eb779af79450c42147ea6c3038641d43eb500690891663077b3"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.862012 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf762b1ecb64eb779af79450c42147ea6c3038641d43eb500690891663077b3" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.862121 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hrq22" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.869773 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8664j" event={"ID":"ed7aa202-c734-4333-a1de-1bdb39d59804","Type":"ContainerDied","Data":"9b5b8b914e9f13ae13ab5c11566677509e8d1e77ce6e654bf5a93d7fab30b16c"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.869831 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b5b8b914e9f13ae13ab5c11566677509e8d1e77ce6e654bf5a93d7fab30b16c" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.869926 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8664j" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.874096 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-fca8-account-create-update-7jkwx" event={"ID":"07a8b78f-e864-49d5-9dfb-aebd86741885","Type":"ContainerDied","Data":"4564e46ddd506f32a201d9f1057e104c34fe9ab13966b5aa8a79230af780d4c0"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.874147 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4564e46ddd506f32a201d9f1057e104c34fe9ab13966b5aa8a79230af780d4c0" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.874220 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-fca8-account-create-update-7jkwx" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.876978 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5q6s" event={"ID":"96942836-243a-48c5-be3d-5eb5e5f166d0","Type":"ContainerStarted","Data":"e71d6d0bfc1a16e1dac8ed8107010142581b341f08eea68db466ac02a741d979"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.880280 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wzxkv" event={"ID":"94df275b-e089-4e1f-8eac-e4806d2f1178","Type":"ContainerDied","Data":"3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.880308 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ca9fa69a71dd94dd374770fb82f842201a6cc3e02147ecade5cd635caacef53" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.880364 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wzxkv" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.882225 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" event={"ID":"7b8b44ba-0853-4287-a0ba-ef1607c66d7b","Type":"ContainerDied","Data":"4fe8d37616588503394b5e0c543034c9d75405c62d38c6a3b4276c05a61d4d46"} Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.882268 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-ll7zk" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.882272 4883 scope.go:117] "RemoveContainer" containerID="ab3d26f0cc1f7b7d4aa73624b5d0a56d07f024661893458f62e11a314a4cd5c5" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.906222 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-w5q6s" podStartSLOduration=2.408408982 podStartE2EDuration="6.906211401s" podCreationTimestamp="2026-03-10 09:22:21 +0000 UTC" firstStartedPulling="2026-03-10 09:22:22.708995526 +0000 UTC m=+1128.963893415" lastFinishedPulling="2026-03-10 09:22:27.206797955 +0000 UTC m=+1133.461695834" observedRunningTime="2026-03-10 09:22:27.905101417 +0000 UTC m=+1134.159999306" watchObservedRunningTime="2026-03-10 09:22:27.906211401 +0000 UTC m=+1134.161109290" Mar 10 09:22:27 crc kubenswrapper[4883]: I0310 09:22:27.928638 4883 scope.go:117] "RemoveContainer" containerID="e241856c7e2d9bdc80ba8f22b6df1569df773a3d438f85a0d6ce70af2f1197e4" Mar 10 09:22:28 crc kubenswrapper[4883]: I0310 09:22:28.239320 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") pod \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\" (UID: \"7b8b44ba-0853-4287-a0ba-ef1607c66d7b\") " Mar 10 09:22:28 crc kubenswrapper[4883]: I0310 09:22:28.239757 4883 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config" (OuterVolumeSpecName: "config") pod "7b8b44ba-0853-4287-a0ba-ef1607c66d7b" (UID: "7b8b44ba-0853-4287-a0ba-ef1607c66d7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:28 crc kubenswrapper[4883]: I0310 09:22:28.341381 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b8b44ba-0853-4287-a0ba-ef1607c66d7b-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:28 crc kubenswrapper[4883]: I0310 09:22:28.513829 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"] Mar 10 09:22:28 crc kubenswrapper[4883]: I0310 09:22:28.519269 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-ll7zk"] Mar 10 09:22:29 crc kubenswrapper[4883]: I0310 09:22:29.900835 4883 generic.go:334] "Generic (PLEG): container finished" podID="96942836-243a-48c5-be3d-5eb5e5f166d0" containerID="e71d6d0bfc1a16e1dac8ed8107010142581b341f08eea68db466ac02a741d979" exitCode=0 Mar 10 09:22:29 crc kubenswrapper[4883]: I0310 09:22:29.900946 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5q6s" event={"ID":"96942836-243a-48c5-be3d-5eb5e5f166d0","Type":"ContainerDied","Data":"e71d6d0bfc1a16e1dac8ed8107010142581b341f08eea68db466ac02a741d979"} Mar 10 09:22:30 crc kubenswrapper[4883]: I0310 09:22:30.091109 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" path="/var/lib/kubelet/pods/7b8b44ba-0853-4287-a0ba-ef1607c66d7b/volumes" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.199446 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.395846 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle\") pod \"96942836-243a-48c5-be3d-5eb5e5f166d0\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.396119 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5vkl\" (UniqueName: \"kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl\") pod \"96942836-243a-48c5-be3d-5eb5e5f166d0\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.396239 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data\") pod \"96942836-243a-48c5-be3d-5eb5e5f166d0\" (UID: \"96942836-243a-48c5-be3d-5eb5e5f166d0\") " Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.402802 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl" (OuterVolumeSpecName: "kube-api-access-s5vkl") pod "96942836-243a-48c5-be3d-5eb5e5f166d0" (UID: "96942836-243a-48c5-be3d-5eb5e5f166d0"). InnerVolumeSpecName "kube-api-access-s5vkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.423267 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96942836-243a-48c5-be3d-5eb5e5f166d0" (UID: "96942836-243a-48c5-be3d-5eb5e5f166d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.440453 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data" (OuterVolumeSpecName: "config-data") pod "96942836-243a-48c5-be3d-5eb5e5f166d0" (UID: "96942836-243a-48c5-be3d-5eb5e5f166d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.498514 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.499445 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5vkl\" (UniqueName: \"kubernetes.io/projected/96942836-243a-48c5-be3d-5eb5e5f166d0-kube-api-access-s5vkl\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.499487 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96942836-243a-48c5-be3d-5eb5e5f166d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.925544 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5q6s" event={"ID":"96942836-243a-48c5-be3d-5eb5e5f166d0","Type":"ContainerDied","Data":"be6c6a8d52e4e43dcf7deb7dc6d88e3b0cf1816a00b3324c09c224772a4f6fa7"} Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.925609 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be6c6a8d52e4e43dcf7deb7dc6d88e3b0cf1816a00b3324c09c224772a4f6fa7" Mar 10 09:22:31 crc kubenswrapper[4883]: I0310 09:22:31.925671 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-w5q6s" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157096 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157617 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a8b78f-e864-49d5-9dfb-aebd86741885" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157659 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a8b78f-e864-49d5-9dfb-aebd86741885" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157707 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24713bd6-5868-43ec-94ec-2371a49a0b88" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157715 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="24713bd6-5868-43ec-94ec-2371a49a0b88" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157728 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94df275b-e089-4e1f-8eac-e4806d2f1178" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157734 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="94df275b-e089-4e1f-8eac-e4806d2f1178" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157762 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157769 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157778 4883 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ed7aa202-c734-4333-a1de-1bdb39d59804" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157784 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7aa202-c734-4333-a1de-1bdb39d59804" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157805 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96942836-243a-48c5-be3d-5eb5e5f166d0" containerName="keystone-db-sync" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157812 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="96942836-243a-48c5-be3d-5eb5e5f166d0" containerName="keystone-db-sync" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157837 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed176738-d518-45e3-be47-3ace090d0e7a" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157845 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed176738-d518-45e3-be47-3ace090d0e7a" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157853 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="dnsmasq-dns" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157860 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="dnsmasq-dns" Mar 10 09:22:32 crc kubenswrapper[4883]: E0310 09:22:32.157880 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="init" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.157886 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="init" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158234 4883 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ed176738-d518-45e3-be47-3ace090d0e7a" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158259 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="94df275b-e089-4e1f-8eac-e4806d2f1178" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158269 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b8b44ba-0853-4287-a0ba-ef1607c66d7b" containerName="dnsmasq-dns" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158277 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a8b78f-e864-49d5-9dfb-aebd86741885" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158290 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" containerName="mariadb-account-create-update" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158297 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="24713bd6-5868-43ec-94ec-2371a49a0b88" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158309 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7aa202-c734-4333-a1de-1bdb39d59804" containerName="mariadb-database-create" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.158317 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="96942836-243a-48c5-be3d-5eb5e5f166d0" containerName="keystone-db-sync" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.159338 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.172139 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.202350 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2fk5l"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.203920 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.210524 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.210562 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.210637 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92kmh" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.210852 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.210986 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216546 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gkb7\" (UniqueName: \"kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216678 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216739 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216783 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216812 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216832 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216852 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkgl9\" (UniqueName: 
\"kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.216954 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.217083 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.217126 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.217152 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.217166 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.220392 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2fk5l"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318177 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318284 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318323 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318345 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318362 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318404 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gkb7\" (UniqueName: \"kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318447 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318489 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.318707 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319622 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319242 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319554 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319696 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319302 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319793 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.319931 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.320014 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkgl9\" (UniqueName: \"kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.326344 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.329246 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.333468 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys\") pod \"keystone-bootstrap-2fk5l\" (UID: 
\"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.334059 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.343089 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.350352 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gkb7\" (UniqueName: \"kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7\") pod \"dnsmasq-dns-5985c59c55-97zkl\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.353950 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.356124 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.360237 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.360517 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.360541 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-n8r6x" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.363529 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.364627 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkgl9\" (UniqueName: \"kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9\") pod \"keystone-bootstrap-2fk5l\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.376096 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.410018 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-x2hf5"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.415262 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.420851 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421167 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421300 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-prwrq" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421354 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421396 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421427 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421513 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421534 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421593 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421626 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lhjm\" (UniqueName: \"kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421691 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421722 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxrwr\" (UniqueName: 
\"kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421763 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.421880 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.443764 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-x2hf5"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.455386 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kcmgr"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.456277 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.458359 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vpjch" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.458611 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.460356 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.474159 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kcmgr"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.475871 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523647 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523712 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523759 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " 
pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523810 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lhjm\" (UniqueName: \"kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523872 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523902 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523919 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxrwr\" (UniqueName: \"kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523966 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4slp7\" (UniqueName: \"kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 
09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.523988 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.524010 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.524065 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.524111 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.524137 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.524156 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.525315 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.530948 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.537397 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.537787 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.543785 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.544339 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.544694 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.545598 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.546496 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.546609 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.550292 4883 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-6889f87769-6j4vp"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.551648 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.572876 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.573963 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.577170 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.577439 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.577651 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g4r2q" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.577865 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.581139 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6889f87769-6j4vp"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.594130 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxrwr\" (UniqueName: \"kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr\") pod \"horizon-798c4d5785-ftwkg\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") " pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.603015 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lhjm\" (UniqueName: 
\"kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm\") pod \"cinder-db-sync-x2hf5\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.623148 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.625827 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4slp7\" (UniqueName: \"kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.625868 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.625971 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.625992 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.626022 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24tkk\" (UniqueName: \"kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.626101 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.626123 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.626176 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.633872 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.635531 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.653326 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-jt8bs"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.654988 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.657767 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.657975 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q2mjf" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.666930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4slp7\" (UniqueName: \"kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7\") pod \"neutron-db-sync-kcmgr\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.708653 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.731896 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.732189 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.732256 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksmfv\" (UniqueName: \"kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.732612 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.732868 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.732917 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.733229 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.733466 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24tkk\" (UniqueName: \"kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.733532 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.733633 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 
09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.733958 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.734190 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.734212 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.735375 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.749626 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.751543 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.752155 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.773135 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.781722 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jt8bs"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.790099 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.794985 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-v9wqz"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.797109 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.803877 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24tkk\" (UniqueName: \"kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk\") pod \"horizon-6889f87769-6j4vp\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.804140 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.810185 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4s72j" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.812823 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.836020 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841184 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841262 4883 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841321 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksmfv\" (UniqueName: \"kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841374 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841400 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841413 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841461 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841505 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psns8\" (UniqueName: \"kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841553 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841587 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.841612 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.842873 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.842972 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.843944 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.844505 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v9wqz"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.845952 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.851832 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.851967 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.854601 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.855683 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.855896 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.866256 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.867882 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.868925 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksmfv\" (UniqueName: \"kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.870873 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.873567 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.877327 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.879137 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.879219 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.881102 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.887607 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.890784 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"] Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.904038 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.936057 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945395 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z76vj\" (UniqueName: \"kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945446 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945492 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945512 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6975\" (UniqueName: \"kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945537 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " 
pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmmc9\" (UniqueName: \"kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945589 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945606 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945626 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945646 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " 
pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945667 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945681 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945704 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945722 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945740 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 
09:22:32.945758 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psns8\" (UniqueName: \"kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945776 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj2sl\" (UniqueName: \"kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945793 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945817 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.945839 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 
09:22:32.946422 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.946544 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.946584 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.947173 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.948109 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 
09:22:32.948176 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.948210 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.948235 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.948267 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.951526 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.953012 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:32 crc kubenswrapper[4883]: I0310 09:22:32.977076 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psns8\" (UniqueName: \"kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8\") pod \"barbican-db-sync-jt8bs\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.009654 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050684 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050742 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050772 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc 
kubenswrapper[4883]: I0310 09:22:33.050803 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050836 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050865 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050890 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050912 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050936 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.050975 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051011 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z76vj\" (UniqueName: \"kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051036 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051060 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051087 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6975\" (UniqueName: \"kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975\") 
pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051112 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051130 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmmc9\" (UniqueName: \"kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051157 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051175 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051198 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " 
pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051219 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051242 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051263 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051286 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051306 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051328 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zj2sl\" (UniqueName: \"kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051345 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051772 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051849 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.051928 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.052813 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: 
\"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.053159 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.054067 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.054871 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.055612 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.056427 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc 
kubenswrapper[4883]: I0310 09:22:33.058709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.066553 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.067995 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.068667 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.070357 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.072073 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.073794 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.074196 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.077692 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.077737 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z76vj\" (UniqueName: \"kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.077805 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " 
pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.077885 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmmc9\" (UniqueName: \"kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.078629 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.080328 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.082955 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj2sl\" (UniqueName: \"kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl\") pod \"dnsmasq-dns-ccd7c9f8f-6pthc\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.083659 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.084147 4883 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-x6975\" (UniqueName: \"kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975\") pod \"placement-db-sync-v9wqz\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") " pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.104012 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.113706 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.190381 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2fk5l"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.223978 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v9wqz" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.229155 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.250027 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.250909 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:33 crc kubenswrapper[4883]: W0310 09:22:33.253750 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5cc1576_7d04_425c_a8a7_06ee1e8f00ce.slice/crio-fe85edaca127e9267ab954be5711233da1e83a78c7ac0a1124d47f6a6d59a071 WatchSource:0}: Error finding container fe85edaca127e9267ab954be5711233da1e83a78c7ac0a1124d47f6a6d59a071: Status 404 returned error can't find the container with id fe85edaca127e9267ab954be5711233da1e83a78c7ac0a1124d47f6a6d59a071 Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.279536 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.529237 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kcmgr"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.539562 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.558438 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-x2hf5"] Mar 10 09:22:33 crc kubenswrapper[4883]: W0310 09:22:33.622769 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a25758_cb77_448b_a856_3dbc6df2bc21.slice/crio-c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e WatchSource:0}: Error finding container c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e: Status 404 returned error can't find the container with id c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e Mar 10 09:22:33 crc 
kubenswrapper[4883]: I0310 09:22:33.671809 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6889f87769-6j4vp"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.880524 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.925339 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jt8bs"] Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.974781 4883 generic.go:334] "Generic (PLEG): container finished" podID="c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" containerID="b6bf873909c8efbafc4e6b27c6fb3f0194fc7bb6f96a1e76fb3a715d4f30fda7" exitCode=0 Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.974875 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" event={"ID":"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce","Type":"ContainerDied","Data":"b6bf873909c8efbafc4e6b27c6fb3f0194fc7bb6f96a1e76fb3a715d4f30fda7"} Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.974927 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" event={"ID":"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce","Type":"ContainerStarted","Data":"fe85edaca127e9267ab954be5711233da1e83a78c7ac0a1124d47f6a6d59a071"} Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.977850 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerStarted","Data":"3484a773396c34bf3e6006181f0ce251b988a8818cc3d0e090ab197865e41fae"} Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.980067 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x2hf5" event={"ID":"dc0b1d9d-7834-473a-a487-6f540c606706","Type":"ContainerStarted","Data":"bd6a779566a1daa68a4f86119c6780e52d4493f0681838af48cbb4dbd90f52cc"} Mar 10 09:22:33 crc 
kubenswrapper[4883]: I0310 09:22:33.981138 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerStarted","Data":"b811a98b9eb641299a8060183de1afd8f605b6eb8c5e07f91e568070217a7cad"} Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.988689 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6889f87769-6j4vp" event={"ID":"a3de0c9e-752c-487f-934f-170386d5f462","Type":"ContainerStarted","Data":"be7e77d7db4925cebe572b32a2b1ac5989560fd1627d674ac963510739fff6e0"} Mar 10 09:22:33 crc kubenswrapper[4883]: I0310 09:22:33.997368 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kcmgr" event={"ID":"f5a25758-cb77-448b-a856-3dbc6df2bc21","Type":"ContainerStarted","Data":"c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e"} Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.001867 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2fk5l" event={"ID":"b0f9ff35-e1f8-4a30-9012-092af1e8ab09","Type":"ContainerStarted","Data":"11797267a9434882035fa9951a681f2d2f7a8ff3989245476cda7f186fafdc21"} Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.001890 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2fk5l" event={"ID":"b0f9ff35-e1f8-4a30-9012-092af1e8ab09","Type":"ContainerStarted","Data":"e5e0e3d6fe3e0b81dac2ef5b9e39c718ea08b72359b689574cb229810532f525"} Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.016103 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kcmgr" podStartSLOduration=2.016086114 podStartE2EDuration="2.016086114s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:34.009186945 +0000 UTC 
m=+1140.264084834" watchObservedRunningTime="2026-03-10 09:22:34.016086114 +0000 UTC m=+1140.270984003" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.029257 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2fk5l" podStartSLOduration=2.029242623 podStartE2EDuration="2.029242623s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:34.025862069 +0000 UTC m=+1140.280759958" watchObservedRunningTime="2026-03-10 09:22:34.029242623 +0000 UTC m=+1140.284140512" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.057415 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v9wqz"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.183053 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"] Mar 10 09:22:34 crc kubenswrapper[4883]: W0310 09:22:34.198517 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87232af6_dc87_4f68_8b1f_850fd98219a8.slice/crio-340a626f298a9ef4ea8b124b00075b5b0663514ede9873d187c8ba97a25a89bf WatchSource:0}: Error finding container 340a626f298a9ef4ea8b124b00075b5b0663514ede9873d187c8ba97a25a89bf: Status 404 returned error can't find the container with id 340a626f298a9ef4ea8b124b00075b5b0663514ede9873d187c8ba97a25a89bf Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.202055 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.251639 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.447735 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523202 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523376 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523450 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523483 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gkb7\" (UniqueName: \"kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523544 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.523628 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb\") pod \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\" (UID: \"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce\") " Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.533050 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.558078 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6889f87769-6j4vp"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.573153 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7" (OuterVolumeSpecName: "kube-api-access-4gkb7") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "kube-api-access-4gkb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.575518 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54485c5c-cpvdz"] Mar 10 09:22:34 crc kubenswrapper[4883]: E0310 09:22:34.576257 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" containerName="init" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.576280 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" containerName="init" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.576637 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" containerName="init" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.583564 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.620424 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54485c5c-cpvdz"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.635537 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gkb7\" (UniqueName: \"kubernetes.io/projected/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-kube-api-access-4gkb7\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.650556 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.659560 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.678592 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.691102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config" (OuterVolumeSpecName: "config") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.693128 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.694077 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.727913 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" (UID: "c5cc1576-7d04-425c-a8a7-06ee1e8f00ce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738404 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738501 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738529 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738681 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49zt\" (UniqueName: \"kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738756 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " 
pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738824 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738842 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738852 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738862 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.738871 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.841274 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49zt\" (UniqueName: \"kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.841389 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs\") pod 
\"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.841522 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.841595 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.841623 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.842228 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.842896 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.843602 
4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.849149 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.856275 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49zt\" (UniqueName: \"kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt\") pod \"horizon-54485c5c-cpvdz\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:34 crc kubenswrapper[4883]: I0310 09:22:34.865884 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.032657 4883 generic.go:334] "Generic (PLEG): container finished" podID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerID="e9f3a5dd63ead621338a9597451694f38d0bb7781f2624ce640a6d0a6dc7e4a2" exitCode=0 Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.032755 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" event={"ID":"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499","Type":"ContainerDied","Data":"e9f3a5dd63ead621338a9597451694f38d0bb7781f2624ce640a6d0a6dc7e4a2"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.033017 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" event={"ID":"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499","Type":"ContainerStarted","Data":"597e041b53a7dcfb1def658755f45ca307eb7a79b514a35cb1ad87244e150850"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.038608 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerStarted","Data":"340a626f298a9ef4ea8b124b00075b5b0663514ede9873d187c8ba97a25a89bf"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.045210 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" event={"ID":"c5cc1576-7d04-425c-a8a7-06ee1e8f00ce","Type":"ContainerDied","Data":"fe85edaca127e9267ab954be5711233da1e83a78c7ac0a1124d47f6a6d59a071"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.045248 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-97zkl" Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.045302 4883 scope.go:117] "RemoveContainer" containerID="b6bf873909c8efbafc4e6b27c6fb3f0194fc7bb6f96a1e76fb3a715d4f30fda7" Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.049909 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v9wqz" event={"ID":"78bfcd03-74e4-4238-ae81-043bc04105cd","Type":"ContainerStarted","Data":"a9fc6acc617749c8e1de867b10f66ead08446875d606f1370db6c642ea9067e0"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.057542 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerStarted","Data":"49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.098846 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jt8bs" event={"ID":"6d78560f-1b01-4ac1-9c36-109595422d78","Type":"ContainerStarted","Data":"1ecd75dacc64ce6638ff3cb4e1b982b79b4377e09b6132685f58b6c4877440c0"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.105956 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerStarted","Data":"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.105987 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerStarted","Data":"d2f4c6a75b56fe5a8d7adbca419f37ccbde86d92a924952b6cc9449838c4a617"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.111358 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kcmgr" 
event={"ID":"f5a25758-cb77-448b-a856-3dbc6df2bc21","Type":"ContainerStarted","Data":"50e03d7f0d395af101a158c665d03ed09d9fb01ebb9c16be32b6ee2d458d2bcb"} Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.122527 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.130158 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-97zkl"] Mar 10 09:22:35 crc kubenswrapper[4883]: I0310 09:22:35.387807 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54485c5c-cpvdz"] Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.098073 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cc1576-7d04-425c-a8a7-06ee1e8f00ce" path="/var/lib/kubelet/pods/c5cc1576-7d04-425c-a8a7-06ee1e8f00ce/volumes" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.135193 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-log" containerID="cri-o://49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922" gracePeriod=30 Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.135259 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerStarted","Data":"1eff90bb60bf17fc9b64fa1d0a5ad1ed0278d91c9ac3bff3d2ef03b8ed5135f1"} Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.135275 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-httpd" containerID="cri-o://1eff90bb60bf17fc9b64fa1d0a5ad1ed0278d91c9ac3bff3d2ef03b8ed5135f1" gracePeriod=30 Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.144786 4883 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54485c5c-cpvdz" event={"ID":"6dde4634-36a7-4142-99e2-9a43f0fbe3fd","Type":"ContainerStarted","Data":"c40847d7eb6f2a5c70938469d31c3d3e46cacefe9ad418b40434e9f2175988e6"} Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.157494 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.157461408 podStartE2EDuration="4.157461408s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:36.154759083 +0000 UTC m=+1142.409656972" watchObservedRunningTime="2026-03-10 09:22:36.157461408 +0000 UTC m=+1142.412359297" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.162455 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-log" containerID="cri-o://883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" gracePeriod=30 Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.162806 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-httpd" containerID="cri-o://8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" gracePeriod=30 Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.162508 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerStarted","Data":"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb"} Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.172692 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" event={"ID":"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499","Type":"ContainerStarted","Data":"368d8e92de1f4ea2c761213a4a664cf7368f309a09685b643b23b733208da7ea"} Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.172730 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.184533 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.184520543 podStartE2EDuration="4.184520543s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:36.183154788 +0000 UTC m=+1142.438052667" watchObservedRunningTime="2026-03-10 09:22:36.184520543 +0000 UTC m=+1142.439418433" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.222245 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" podStartSLOduration=4.222222338 podStartE2EDuration="4.222222338s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:22:36.205014438 +0000 UTC m=+1142.459912327" watchObservedRunningTime="2026-03-10 09:22:36.222222338 +0000 UTC m=+1142.477120227" Mar 10 09:22:36 crc kubenswrapper[4883]: E0310 09:22:36.313299 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff8879ef_c406_48f4_a67a_d5cedcf8e886.slice/crio-8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff8879ef_c406_48f4_a67a_d5cedcf8e886.slice/crio-conmon-8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded02e2ef_66b0_48ae_b81b_20ac50baa423.slice/crio-conmon-49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded02e2ef_66b0_48ae_b81b_20ac50baa423.slice/crio-49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.832596 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907020 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907068 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907107 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 
09:22:36.907219 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z76vj\" (UniqueName: \"kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907239 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907273 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907392 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.907491 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run\") pod \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\" (UID: \"ff8879ef-c406-48f4-a67a-d5cedcf8e886\") " Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.908151 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs" (OuterVolumeSpecName: "logs") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: 
"ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.908273 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.908673 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.908693 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff8879ef-c406-48f4-a67a-d5cedcf8e886-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.915508 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.917688 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj" (OuterVolumeSpecName: "kube-api-access-z76vj") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "kube-api-access-z76vj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.944435 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts" (OuterVolumeSpecName: "scripts") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.950587 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.969323 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data" (OuterVolumeSpecName: "config-data") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:36 crc kubenswrapper[4883]: I0310 09:22:36.990080 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ff8879ef-c406-48f4-a67a-d5cedcf8e886" (UID: "ff8879ef-c406-48f4-a67a-d5cedcf8e886"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010081 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010107 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010118 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010127 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z76vj\" (UniqueName: \"kubernetes.io/projected/ff8879ef-c406-48f4-a67a-d5cedcf8e886-kube-api-access-z76vj\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010159 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.010168 4883 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8879ef-c406-48f4-a67a-d5cedcf8e886-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.030466 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.112371 4883 reconciler_common.go:293] "Volume detached for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.182968 4883 generic.go:334] "Generic (PLEG): container finished" podID="b0f9ff35-e1f8-4a30-9012-092af1e8ab09" containerID="11797267a9434882035fa9951a681f2d2f7a8ff3989245476cda7f186fafdc21" exitCode=0 Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.183074 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2fk5l" event={"ID":"b0f9ff35-e1f8-4a30-9012-092af1e8ab09","Type":"ContainerDied","Data":"11797267a9434882035fa9951a681f2d2f7a8ff3989245476cda7f186fafdc21"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.186950 4883 generic.go:334] "Generic (PLEG): container finished" podID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerID="1eff90bb60bf17fc9b64fa1d0a5ad1ed0278d91c9ac3bff3d2ef03b8ed5135f1" exitCode=0 Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.186976 4883 generic.go:334] "Generic (PLEG): container finished" podID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerID="49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922" exitCode=143 Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.187033 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerDied","Data":"1eff90bb60bf17fc9b64fa1d0a5ad1ed0278d91c9ac3bff3d2ef03b8ed5135f1"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.187102 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerDied","Data":"49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190536 4883 generic.go:334] "Generic (PLEG): container finished" podID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" 
containerID="8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" exitCode=143 Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190557 4883 generic.go:334] "Generic (PLEG): container finished" podID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerID="883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" exitCode=143 Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190630 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190646 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerDied","Data":"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190793 4883 scope.go:117] "RemoveContainer" containerID="8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.190720 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerDied","Data":"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.191121 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff8879ef-c406-48f4-a67a-d5cedcf8e886","Type":"ContainerDied","Data":"d2f4c6a75b56fe5a8d7adbca419f37ccbde86d92a924952b6cc9449838c4a617"} Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.254874 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.264053 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 
09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.265390 4883 scope.go:117] "RemoveContainer" containerID="883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.279340 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:37 crc kubenswrapper[4883]: E0310 09:22:37.279965 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-log" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.279986 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-log" Mar 10 09:22:37 crc kubenswrapper[4883]: E0310 09:22:37.279998 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-httpd" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.280006 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-httpd" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.280436 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-log" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.280460 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" containerName="glance-httpd" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.281826 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.284550 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.284940 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.286698 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.304523 4883 scope.go:117] "RemoveContainer" containerID="8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" Mar 10 09:22:37 crc kubenswrapper[4883]: E0310 09:22:37.305210 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb\": container with ID starting with 8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb not found: ID does not exist" containerID="8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.305244 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb"} err="failed to get container status \"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb\": rpc error: code = NotFound desc = could not find container \"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb\": container with ID starting with 8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb not found: ID does not exist" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.306182 4883 scope.go:117] "RemoveContainer" 
containerID="883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" Mar 10 09:22:37 crc kubenswrapper[4883]: E0310 09:22:37.308565 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f\": container with ID starting with 883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f not found: ID does not exist" containerID="883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.308609 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f"} err="failed to get container status \"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f\": rpc error: code = NotFound desc = could not find container \"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f\": container with ID starting with 883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f not found: ID does not exist" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.308635 4883 scope.go:117] "RemoveContainer" containerID="8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.309160 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb"} err="failed to get container status \"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb\": rpc error: code = NotFound desc = could not find container \"8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb\": container with ID starting with 8a93485ca0acb9e9426361e6d856bd8363fb01b54abd2032ecadeebb83feeedb not found: ID does not exist" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.309189 4883 scope.go:117] 
"RemoveContainer" containerID="883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.309630 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f"} err="failed to get container status \"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f\": rpc error: code = NotFound desc = could not find container \"883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f\": container with ID starting with 883ed6f9fd3ec4ffadcc6ff17a415dcd4e01e4c042731f15fec30371b309532f not found: ID does not exist" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.424306 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.424720 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425094 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425173 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425296 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425364 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425521 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26kq2\" (UniqueName: \"kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.425586 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 
09:22:37.527695 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26kq2\" (UniqueName: \"kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.527732 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.527858 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.527887 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.528141 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.528191 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.528449 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.528672 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.528950 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.535002 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.536947 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.537101 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.537111 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.537759 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.543191 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.545969 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26kq2\" (UniqueName: \"kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2\") pod 
\"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.563403 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.605970 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:22:37 crc kubenswrapper[4883]: I0310 09:22:37.820291 4883 scope.go:117] "RemoveContainer" containerID="3c695c439ea9e2ad9e771b20e1905dadd59374ebe052ec433b69ea1e82161c99" Mar 10 09:22:38 crc kubenswrapper[4883]: I0310 09:22:38.091380 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff8879ef-c406-48f4-a67a-d5cedcf8e886" path="/var/lib/kubelet/pods/ff8879ef-c406-48f4-a67a-d5cedcf8e886/volumes" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.889986 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"] Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.924830 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.937806 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"] Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.939140 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.942008 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.964515 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"] Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997125 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997193 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997368 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scthm\" (UniqueName: \"kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997409 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " 
pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997457 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997507 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:40 crc kubenswrapper[4883]: I0310 09:22:40.997637 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.036566 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54485c5c-cpvdz"] Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.068515 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-555c96ddb-t7tcm"] Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.070071 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.090831 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-555c96ddb-t7tcm"] Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119655 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119741 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119771 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119873 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scthm\" (UniqueName: \"kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119903 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119938 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.119964 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.124825 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.125408 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.129240 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " 
pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.130539 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.131041 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.132020 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.159281 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scthm\" (UniqueName: \"kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm\") pod \"horizon-5fcc9bbb48-lf4jb\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.222678 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-config-data\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 
09:22:41.222737 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mbr7\" (UniqueName: \"kubernetes.io/projected/ef0598ad-c7ea-4645-b553-7d9028397156-kube-api-access-7mbr7\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.222802 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-tls-certs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.222836 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-secret-key\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.222859 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-scripts\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.223538 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-combined-ca-bundle\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: 
I0310 09:22:41.223686 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef0598ad-c7ea-4645-b553-7d9028397156-logs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.261036 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.326884 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-tls-certs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.326956 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-secret-key\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.326981 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-scripts\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327083 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-combined-ca-bundle\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " 
pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327136 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef0598ad-c7ea-4645-b553-7d9028397156-logs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327187 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-config-data\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327230 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mbr7\" (UniqueName: \"kubernetes.io/projected/ef0598ad-c7ea-4645-b553-7d9028397156-kube-api-access-7mbr7\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327750 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef0598ad-c7ea-4645-b553-7d9028397156-logs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.327862 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-scripts\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.328939 4883 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef0598ad-c7ea-4645-b553-7d9028397156-config-data\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.330763 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-secret-key\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.331041 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-combined-ca-bundle\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.331944 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0598ad-c7ea-4645-b553-7d9028397156-horizon-tls-certs\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.344054 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mbr7\" (UniqueName: \"kubernetes.io/projected/ef0598ad-c7ea-4645-b553-7d9028397156-kube-api-access-7mbr7\") pod \"horizon-555c96ddb-t7tcm\" (UID: \"ef0598ad-c7ea-4645-b553-7d9028397156\") " pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.394194 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.845981 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.940489 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.940561 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.940625 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.941321 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkgl9\" (UniqueName: \"kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.941351 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: 
\"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.941400 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys\") pod \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\" (UID: \"b0f9ff35-e1f8-4a30-9012-092af1e8ab09\") " Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.946528 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.946887 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9" (OuterVolumeSpecName: "kube-api-access-pkgl9") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "kube-api-access-pkgl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.947902 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.950054 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts" (OuterVolumeSpecName: "scripts") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.964932 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data" (OuterVolumeSpecName: "config-data") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:41 crc kubenswrapper[4883]: I0310 09:22:41.966120 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0f9ff35-e1f8-4a30-9012-092af1e8ab09" (UID: "b0f9ff35-e1f8-4a30-9012-092af1e8ab09"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044661 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044697 4883 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044710 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044722 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkgl9\" (UniqueName: \"kubernetes.io/projected/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-kube-api-access-pkgl9\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044735 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.044743 4883 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b0f9ff35-e1f8-4a30-9012-092af1e8ab09-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.258525 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2fk5l" event={"ID":"b0f9ff35-e1f8-4a30-9012-092af1e8ab09","Type":"ContainerDied","Data":"e5e0e3d6fe3e0b81dac2ef5b9e39c718ea08b72359b689574cb229810532f525"} Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 
09:22:42.258590 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5e0e3d6fe3e0b81dac2ef5b9e39c718ea08b72359b689574cb229810532f525" Mar 10 09:22:42 crc kubenswrapper[4883]: I0310 09:22:42.258625 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2fk5l" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.014085 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2fk5l"] Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.023496 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2fk5l"] Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.115587 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-n7f74"] Mar 10 09:22:43 crc kubenswrapper[4883]: E0310 09:22:43.116035 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f9ff35-e1f8-4a30-9012-092af1e8ab09" containerName="keystone-bootstrap" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.116051 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f9ff35-e1f8-4a30-9012-092af1e8ab09" containerName="keystone-bootstrap" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.116293 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f9ff35-e1f8-4a30-9012-092af1e8ab09" containerName="keystone-bootstrap" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.118616 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.120417 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.120690 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.120791 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92kmh" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.120868 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.121110 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.121942 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n7f74"] Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.269560 4883 generic.go:334] "Generic (PLEG): container finished" podID="f5a25758-cb77-448b-a856-3dbc6df2bc21" containerID="50e03d7f0d395af101a158c665d03ed09d9fb01ebb9c16be32b6ee2d458d2bcb" exitCode=0 Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.269655 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kcmgr" event={"ID":"f5a25758-cb77-448b-a856-3dbc6df2bc21","Type":"ContainerDied","Data":"50e03d7f0d395af101a158c665d03ed09d9fb01ebb9c16be32b6ee2d458d2bcb"} Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.274763 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 
09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.274906 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.275022 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.275178 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mmp6\" (UniqueName: \"kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.275263 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.275322 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 
09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.282610 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.350108 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"] Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.350364 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" containerID="cri-o://751aa4bf679ccd2fb9310821003d8071e637bbcc3be2f295875a2abfad364ce2" gracePeriod=10 Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.377387 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.377652 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.377875 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.377950 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.378037 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mmp6\" (UniqueName: \"kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.378096 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.381797 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.382038 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.382131 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data\") pod \"keystone-bootstrap-n7f74\" (UID: 
\"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.390707 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.396834 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mmp6\" (UniqueName: \"kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.398005 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts\") pod \"keystone-bootstrap-n7f74\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:43 crc kubenswrapper[4883]: I0310 09:22:43.437607 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:22:44 crc kubenswrapper[4883]: I0310 09:22:44.091749 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f9ff35-e1f8-4a30-9012-092af1e8ab09" path="/var/lib/kubelet/pods/b0f9ff35-e1f8-4a30-9012-092af1e8ab09/volumes" Mar 10 09:22:44 crc kubenswrapper[4883]: I0310 09:22:44.282889 4883 generic.go:334] "Generic (PLEG): container finished" podID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerID="751aa4bf679ccd2fb9310821003d8071e637bbcc3be2f295875a2abfad364ce2" exitCode=0 Mar 10 09:22:44 crc kubenswrapper[4883]: I0310 09:22:44.282959 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" event={"ID":"631a0fc2-de6d-4778-bce2-46b69c306e44","Type":"ContainerDied","Data":"751aa4bf679ccd2fb9310821003d8071e637bbcc3be2f295875a2abfad364ce2"} Mar 10 09:22:46 crc kubenswrapper[4883]: I0310 09:22:46.386044 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Mar 10 09:22:47 crc kubenswrapper[4883]: I0310 09:22:47.449043 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:22:47 crc kubenswrapper[4883]: I0310 09:22:47.449128 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:22:53 crc kubenswrapper[4883]: 
E0310 09:22:53.285073 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.286053 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n99h694h655h66fhfdh58fh66bh58ch5f4h5b7h6ch97hc5h644h5fbh554hbch646h68fh74h64fh5d4h5c7h5fdh5b9h5cdh56h5bdh5d6h7dhbh567q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-24tkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*t
rue,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6889f87769-6j4vp_openstack(a3de0c9e-752c-487f-934f-170386d5f462): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.288189 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b\\\"\"]" pod="openstack/horizon-6889f87769-6j4vp" podUID="a3de0c9e-752c-487f-934f-170386d5f462" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.291672 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.291934 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6bh648h56chd4hbh574h586h5cdh659h566h5c8h75h79h65dh58bh5cdh684h595hb5h5d7hb6h66dh578h575h6ch544h5dbh54fh544hb6h558h686q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b49zt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-54485c5c-cpvdz_openstack(6dde4634-36a7-4142-99e2-9a43f0fbe3fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 
09:22:53.295019 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b\\\"\"]" pod="openstack/horizon-54485c5c-cpvdz" podUID="6dde4634-36a7-4142-99e2-9a43f0fbe3fd" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.345606 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.356143 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.407105 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kcmgr" event={"ID":"f5a25758-cb77-448b-a856-3dbc6df2bc21","Type":"ContainerDied","Data":"c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e"} Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.407157 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c85ef316b645cd5cbb2d862d97a1736c90cb354713f7f2807afbf6d10211284e" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.407229 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kcmgr" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.410510 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed02e2ef-66b0-48ae-b81b-20ac50baa423","Type":"ContainerDied","Data":"3484a773396c34bf3e6006181f0ce251b988a8818cc3d0e090ab197865e41fae"} Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.410562 4883 scope.go:117] "RemoveContainer" containerID="1eff90bb60bf17fc9b64fa1d0a5ad1ed0278d91c9ac3bff3d2ef03b8ed5135f1" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.410807 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438083 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksmfv\" (UniqueName: \"kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438205 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438242 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4slp7\" (UniqueName: \"kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7\") pod \"f5a25758-cb77-448b-a856-3dbc6df2bc21\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438263 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle\") pod \"f5a25758-cb77-448b-a856-3dbc6df2bc21\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438299 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config\") pod \"f5a25758-cb77-448b-a856-3dbc6df2bc21\" (UID: \"f5a25758-cb77-448b-a856-3dbc6df2bc21\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438337 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438389 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438525 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438623 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438672 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438700 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle\") pod \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\" (UID: \"ed02e2ef-66b0-48ae-b81b-20ac50baa423\") " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.438720 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs" (OuterVolumeSpecName: "logs") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.439325 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.441102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.454727 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts" (OuterVolumeSpecName: "scripts") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.454815 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.464737 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv" (OuterVolumeSpecName: "kube-api-access-ksmfv") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "kube-api-access-ksmfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.465079 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7" (OuterVolumeSpecName: "kube-api-access-4slp7") pod "f5a25758-cb77-448b-a856-3dbc6df2bc21" (UID: "f5a25758-cb77-448b-a856-3dbc6df2bc21"). InnerVolumeSpecName "kube-api-access-4slp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.469877 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config" (OuterVolumeSpecName: "config") pod "f5a25758-cb77-448b-a856-3dbc6df2bc21" (UID: "f5a25758-cb77-448b-a856-3dbc6df2bc21"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.474798 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5a25758-cb77-448b-a856-3dbc6df2bc21" (UID: "f5a25758-cb77-448b-a856-3dbc6df2bc21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.480102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.496886 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.505380 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data" (OuterVolumeSpecName: "config-data") pod "ed02e2ef-66b0-48ae-b81b-20ac50baa423" (UID: "ed02e2ef-66b0-48ae-b81b-20ac50baa423"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544301 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed02e2ef-66b0-48ae-b81b-20ac50baa423-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544355 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544370 4883 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544380 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544390 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksmfv\" (UniqueName: \"kubernetes.io/projected/ed02e2ef-66b0-48ae-b81b-20ac50baa423-kube-api-access-ksmfv\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544402 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4slp7\" (UniqueName: \"kubernetes.io/projected/f5a25758-cb77-448b-a856-3dbc6df2bc21-kube-api-access-4slp7\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544411 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 
09:22:53.544420 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5a25758-cb77-448b-a856-3dbc6df2bc21-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544428 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.544437 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed02e2ef-66b0-48ae-b81b-20ac50baa423-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.559188 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.650408 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.753326 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.759367 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791088 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.791602 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a25758-cb77-448b-a856-3dbc6df2bc21" containerName="neutron-db-sync" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791623 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f5a25758-cb77-448b-a856-3dbc6df2bc21" containerName="neutron-db-sync" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.791649 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-httpd" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791656 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-httpd" Mar 10 09:22:53 crc kubenswrapper[4883]: E0310 09:22:53.791666 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-log" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791674 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-log" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791837 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-log" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791856 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" containerName="glance-httpd" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.791864 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a25758-cb77-448b-a856-3dbc6df2bc21" containerName="neutron-db-sync" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.792930 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.794854 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.796065 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.805872 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.956931 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957343 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957403 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957436 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957489 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957530 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff6w4\" (UniqueName: \"kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957641 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:53 crc kubenswrapper[4883]: I0310 09:22:53.957757 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059670 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059825 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059889 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059917 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059937 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059969 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff6w4\" (UniqueName: 
\"kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.060062 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.060095 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.059914 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.060736 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.060814 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.066444 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.066599 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.067021 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.068085 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.076112 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff6w4\" (UniqueName: \"kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4\") pod 
\"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.080081 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.089921 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed02e2ef-66b0-48ae-b81b-20ac50baa423" path="/var/lib/kubelet/pods/ed02e2ef-66b0-48ae-b81b-20ac50baa423/volumes" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.112760 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.514258 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"] Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.517763 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.536276 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"] Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.600446 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"] Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.603028 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.609936 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vpjch" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.610157 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.610899 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.617154 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.655596 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"] Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.677951 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.678711 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.679014 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc\") pod 
\"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.679151 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.679279 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.679456 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27k8t\" (UniqueName: \"kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781071 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc9sx\" (UniqueName: \"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781134 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781190 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781225 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781256 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781277 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781315 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27k8t\" (UniqueName: 
\"kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781352 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781374 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781394 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.781412 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.782227 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.783256 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.783848 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.784323 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.785077 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.802784 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27k8t\" (UniqueName: \"kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t\") pod 
\"dnsmasq-dns-7859c7799c-nf489\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.844221 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.883394 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc9sx\" (UniqueName: \"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.883530 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.883566 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.883666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.883689 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.888246 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.888874 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.891621 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.894681 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle\") pod \"neutron-f6f8846bd-rdwfd\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.896945 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc9sx\" (UniqueName: \"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx\") pod \"neutron-f6f8846bd-rdwfd\" (UID: 
\"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:54 crc kubenswrapper[4883]: I0310 09:22:54.923834 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.386573 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.610441 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5bc48b486f-2j899"] Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.611860 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.614421 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.614743 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.618951 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bc48b486f-2j899"] Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621121 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621168 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj9nk\" 
(UniqueName: \"kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621200 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621261 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621287 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621317 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.621361 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723530 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj9nk\" (UniqueName: \"kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723615 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723676 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723705 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config\") pod 
\"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723739 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.723786 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.729852 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.730113 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.740453 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 
09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.741889 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.742000 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.742879 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj9nk\" (UniqueName: \"kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.743048 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs\") pod \"neutron-5bc48b486f-2j899\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:56 crc kubenswrapper[4883]: I0310 09:22:56.941874 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:22:57 crc kubenswrapper[4883]: E0310 09:22:57.317534 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3" Mar 10 09:22:57 crc kubenswrapper[4883]: E0310 09:22:57.317990 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6975,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagatio
n:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-v9wqz_openstack(78bfcd03-74e4-4238-ae81-043bc04105cd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:22:57 crc kubenswrapper[4883]: E0310 09:22:57.319184 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-v9wqz" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.400818 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.451194 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" event={"ID":"631a0fc2-de6d-4778-bce2-46b69c306e44","Type":"ContainerDied","Data":"5064013cb3a4c978ab4c5b11395e9bcf9761e80e64e6533a75fe3d587b5a3d7c"} Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.451219 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" Mar 10 09:22:57 crc kubenswrapper[4883]: E0310 09:22:57.452814 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3\\\"\"" pod="openstack/placement-db-sync-v9wqz" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.538805 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.538894 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.539049 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.539780 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.539822 
4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpsnd\" (UniqueName: \"kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.539845 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config\") pod \"631a0fc2-de6d-4778-bce2-46b69c306e44\" (UID: \"631a0fc2-de6d-4778-bce2-46b69c306e44\") " Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.545357 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd" (OuterVolumeSpecName: "kube-api-access-dpsnd") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "kube-api-access-dpsnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.575997 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.579556 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.580797 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.583232 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config" (OuterVolumeSpecName: "config") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.587450 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "631a0fc2-de6d-4778-bce2-46b69c306e44" (UID: "631a0fc2-de6d-4778-bce2-46b69c306e44"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642458 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642510 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642524 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642536 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642550 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpsnd\" (UniqueName: \"kubernetes.io/projected/631a0fc2-de6d-4778-bce2-46b69c306e44-kube-api-access-dpsnd\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.642563 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631a0fc2-de6d-4778-bce2-46b69c306e44-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.795897 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"] Mar 10 09:22:57 crc kubenswrapper[4883]: I0310 09:22:57.804395 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-scjqt"] Mar 10 
09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.098356 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" path="/var/lib/kubelet/pods/631a0fc2-de6d-4778-bce2-46b69c306e44/volumes" Mar 10 09:22:58 crc kubenswrapper[4883]: E0310 09:22:58.573673 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 10 09:22:58 crc kubenswrapper[4883]: E0310 09:22:58.573947 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lhjm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-x2hf5_openstack(dc0b1d9d-7834-473a-a487-6f540c606706): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:22:58 crc kubenswrapper[4883]: E0310 09:22:58.576395 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-x2hf5" podUID="dc0b1d9d-7834-473a-a487-6f540c606706" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.632385 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.638209 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.671915 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts\") pod \"a3de0c9e-752c-487f-934f-170386d5f462\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.672081 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24tkk\" (UniqueName: \"kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk\") pod \"a3de0c9e-752c-487f-934f-170386d5f462\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.672129 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data\") pod \"a3de0c9e-752c-487f-934f-170386d5f462\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.672189 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data\") pod \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.672629 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts" (OuterVolumeSpecName: "scripts") pod "a3de0c9e-752c-487f-934f-170386d5f462" (UID: "a3de0c9e-752c-487f-934f-170386d5f462"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.673243 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data" (OuterVolumeSpecName: "config-data") pod "6dde4634-36a7-4142-99e2-9a43f0fbe3fd" (UID: "6dde4634-36a7-4142-99e2-9a43f0fbe3fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.673265 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data" (OuterVolumeSpecName: "config-data") pod "a3de0c9e-752c-487f-934f-170386d5f462" (UID: "a3de0c9e-752c-487f-934f-170386d5f462"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.673383 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts\") pod \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.673428 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs\") pod \"a3de0c9e-752c-487f-934f-170386d5f462\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.674273 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs" (OuterVolumeSpecName: "logs") pod "a3de0c9e-752c-487f-934f-170386d5f462" (UID: "a3de0c9e-752c-487f-934f-170386d5f462"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.674283 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts" (OuterVolumeSpecName: "scripts") pod "6dde4634-36a7-4142-99e2-9a43f0fbe3fd" (UID: "6dde4634-36a7-4142-99e2-9a43f0fbe3fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.674377 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs\") pod \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.674543 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b49zt\" (UniqueName: \"kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt\") pod \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.674809 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs" (OuterVolumeSpecName: "logs") pod "6dde4634-36a7-4142-99e2-9a43f0fbe3fd" (UID: "6dde4634-36a7-4142-99e2-9a43f0fbe3fd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.675076 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key\") pod \"a3de0c9e-752c-487f-934f-170386d5f462\" (UID: \"a3de0c9e-752c-487f-934f-170386d5f462\") " Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.675120 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key\") pod \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\" (UID: \"6dde4634-36a7-4142-99e2-9a43f0fbe3fd\") " Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676281 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676301 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3de0c9e-752c-487f-934f-170386d5f462-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676311 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676320 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676328 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a3de0c9e-752c-487f-934f-170386d5f462-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.676336 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.679560 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk" (OuterVolumeSpecName: "kube-api-access-24tkk") pod "a3de0c9e-752c-487f-934f-170386d5f462" (UID: "a3de0c9e-752c-487f-934f-170386d5f462"). InnerVolumeSpecName "kube-api-access-24tkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.679705 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt" (OuterVolumeSpecName: "kube-api-access-b49zt") pod "6dde4634-36a7-4142-99e2-9a43f0fbe3fd" (UID: "6dde4634-36a7-4142-99e2-9a43f0fbe3fd"). InnerVolumeSpecName "kube-api-access-b49zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.679946 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a3de0c9e-752c-487f-934f-170386d5f462" (UID: "a3de0c9e-752c-487f-934f-170386d5f462"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.680071 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6dde4634-36a7-4142-99e2-9a43f0fbe3fd" (UID: "6dde4634-36a7-4142-99e2-9a43f0fbe3fd"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.779431 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b49zt\" (UniqueName: \"kubernetes.io/projected/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-kube-api-access-b49zt\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.779469 4883 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a3de0c9e-752c-487f-934f-170386d5f462-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.779499 4883 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6dde4634-36a7-4142-99e2-9a43f0fbe3fd-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.779508 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24tkk\" (UniqueName: \"kubernetes.io/projected/a3de0c9e-752c-487f-934f-170386d5f462-kube-api-access-24tkk\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:58 crc kubenswrapper[4883]: E0310 09:22:58.919856 4883 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:57dfeeb1cb430ed73e6db471592cfb1a5f25d3d5c083f82d4a676f936978be81" Mar 10 09:22:58 crc kubenswrapper[4883]: E0310 
09:22:58.920010 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:57dfeeb1cb430ed73e6db471592cfb1a5f25d3d5c083f82d4a676f936978be81,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n589h648h56h679h5c5h567h59bhd7hbbh59dh58bh545h5c6h99h7bh77h55fh56chcfh98h587h67h5d5h554hd7h547h57fh5c9h556h546h584h57bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wmmc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(87232af6-dc87-4f68-8b1f-850fd98219a8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:22:58 crc kubenswrapper[4883]: I0310 09:22:58.965142 4883 scope.go:117] "RemoveContainer" containerID="49fe89955ce62ad3e05a60ac58b5be18c2e6c58d590ba75b86becfb6388ff922" Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.067995 4883 scope.go:117] "RemoveContainer" containerID="751aa4bf679ccd2fb9310821003d8071e637bbcc3be2f295875a2abfad364ce2" Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.142267 4883 scope.go:117] "RemoveContainer" containerID="e2b7245fe34a086ba653b170ba754d14297a7a81b375fe409da9fc2787c69d3f" Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.480413 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54485c5c-cpvdz" event={"ID":"6dde4634-36a7-4142-99e2-9a43f0fbe3fd","Type":"ContainerDied","Data":"c40847d7eb6f2a5c70938469d31c3d3e46cacefe9ad418b40434e9f2175988e6"} Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.480865 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54485c5c-cpvdz" Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.499360 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"] Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.508043 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jt8bs" event={"ID":"6d78560f-1b01-4ac1-9c36-109595422d78","Type":"ContainerStarted","Data":"655c18b53a1ce432bb921c39b01b2479f6ad37de70ab46a32b33f42c98fb125b"} Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.514969 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6889f87769-6j4vp" event={"ID":"a3de0c9e-752c-487f-934f-170386d5f462","Type":"ContainerDied","Data":"be7e77d7db4925cebe572b32a2b1ac5989560fd1627d674ac963510739fff6e0"} Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.518953 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n7f74"] Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.542976 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6889f87769-6j4vp" Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.554926 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerStarted","Data":"c9fb87caf677ab775fd09fb5b4c1268f29ff2c553286a49e90875b8415d4d3a9"} Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.555093 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-798c4d5785-ftwkg" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon-log" containerID="cri-o://c9fb87caf677ab775fd09fb5b4c1268f29ff2c553286a49e90875b8415d4d3a9" gracePeriod=30 Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.555217 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-798c4d5785-ftwkg" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon" containerID="cri-o://0e86e22acb39b181cf412775b1b1f806082c86689f0257c2525c32d95cd32424" gracePeriod=30 Mar 10 09:22:59 crc kubenswrapper[4883]: E0310 09:22:59.569872 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-x2hf5" podUID="dc0b1d9d-7834-473a-a487-6f540c606706" Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.586802 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"] Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.605066 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-555c96ddb-t7tcm"] Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.622949 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-54485c5c-cpvdz"] Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.627973 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54485c5c-cpvdz"] Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.629213 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-jt8bs" podStartSLOduration=2.686991563 podStartE2EDuration="27.629195334s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="2026-03-10 09:22:33.973707762 +0000 UTC m=+1140.228605651" lastFinishedPulling="2026-03-10 09:22:58.915911533 +0000 UTC m=+1165.170809422" observedRunningTime="2026-03-10 09:22:59.553152491 +0000 UTC m=+1165.808050380" watchObservedRunningTime="2026-03-10 09:22:59.629195334 +0000 UTC m=+1165.884093222" Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.670391 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-798c4d5785-ftwkg" podStartSLOduration=2.38174907 podStartE2EDuration="27.670372651s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="2026-03-10 09:22:33.62300178 +0000 UTC m=+1139.877899669" lastFinishedPulling="2026-03-10 09:22:58.91162536 +0000 UTC m=+1165.166523250" observedRunningTime="2026-03-10 09:22:59.619037318 +0000 UTC m=+1165.873935207" watchObservedRunningTime="2026-03-10 09:22:59.670372651 +0000 UTC m=+1165.925270540" Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.671027 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.689638 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6889f87769-6j4vp"] Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.693907 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6889f87769-6j4vp"] Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.698303 4883 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bc48b486f-2j899"] Mar 10 09:22:59 crc kubenswrapper[4883]: W0310 09:22:59.729840 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77895d16_8ad3_4edb_ae91_d807afd499b3.slice/crio-3e2a2bca9130ffcd91c88897030c731f9414ca6f4f8e578189201420e38446dd WatchSource:0}: Error finding container 3e2a2bca9130ffcd91c88897030c731f9414ca6f4f8e578189201420e38446dd: Status 404 returned error can't find the container with id 3e2a2bca9130ffcd91c88897030c731f9414ca6f4f8e578189201420e38446dd Mar 10 09:22:59 crc kubenswrapper[4883]: I0310 09:22:59.798757 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"] Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.092936 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dde4634-36a7-4142-99e2-9a43f0fbe3fd" path="/var/lib/kubelet/pods/6dde4634-36a7-4142-99e2-9a43f0fbe3fd/volumes" Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.093490 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3de0c9e-752c-487f-934f-170386d5f462" path="/var/lib/kubelet/pods/a3de0c9e-752c-487f-934f-170386d5f462/volumes" Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.366706 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.576165 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-555c96ddb-t7tcm" event={"ID":"ef0598ad-c7ea-4645-b553-7d9028397156","Type":"ContainerStarted","Data":"174bee987ac064b45e697523f8e929df13b7b8a6d6f5713d0d159940e38b9e6c"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.576205 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-555c96ddb-t7tcm" 
event={"ID":"ef0598ad-c7ea-4645-b553-7d9028397156","Type":"ContainerStarted","Data":"97c9a096d73de1c3f1ffacb4309122cf185624906c4b45132ebef04bebf6385c"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.576214 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-555c96ddb-t7tcm" event={"ID":"ef0598ad-c7ea-4645-b553-7d9028397156","Type":"ContainerStarted","Data":"66d01928941e03c04f9704e829affc43d592c9977316f15f431dfc6ab92edf9c"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.590608 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerStarted","Data":"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.590663 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerStarted","Data":"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.590676 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerStarted","Data":"3e2a2bca9130ffcd91c88897030c731f9414ca6f4f8e578189201420e38446dd"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.591263 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.600956 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-555c96ddb-t7tcm" podStartSLOduration=19.600934309 podStartE2EDuration="19.600934309s" podCreationTimestamp="2026-03-10 09:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 09:23:00.595955771 +0000 UTC m=+1166.850853660" watchObservedRunningTime="2026-03-10 09:23:00.600934309 +0000 UTC m=+1166.855832189" Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.602150 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerStarted","Data":"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.602180 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerStarted","Data":"98d045da7fa92d2ea6ec832a583b37763ca71714b8c37b66a7d614c5c8099df1"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.604427 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7f74" event={"ID":"5acdd73a-9879-4507-8f6d-10e2ad8065e4","Type":"ContainerStarted","Data":"607df7cc400848b9303988905dd099e3a6ed13423fde585243a8fdb269315664"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.604463 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7f74" event={"ID":"5acdd73a-9879-4507-8f6d-10e2ad8065e4","Type":"ContainerStarted","Data":"0d148133502c63f130313a1a9a41570703ee3c780cfc640a21d5ed94368c1fc2"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.633483 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5bc48b486f-2j899" podStartSLOduration=4.633457808 podStartE2EDuration="4.633457808s" podCreationTimestamp="2026-03-10 09:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:00.611565757 +0000 UTC m=+1166.866463647" watchObservedRunningTime="2026-03-10 09:23:00.633457808 +0000 UTC m=+1166.888355698" Mar 10 09:23:00 crc 
kubenswrapper[4883]: I0310 09:23:00.639651 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-n7f74" podStartSLOduration=17.639642782 podStartE2EDuration="17.639642782s" podCreationTimestamp="2026-03-10 09:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:00.635119813 +0000 UTC m=+1166.890017702" watchObservedRunningTime="2026-03-10 09:23:00.639642782 +0000 UTC m=+1166.894540681" Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.654327 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerStarted","Data":"0e86e22acb39b181cf412775b1b1f806082c86689f0257c2525c32d95cd32424"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.668349 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerStarted","Data":"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.668378 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerStarted","Data":"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.668390 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerStarted","Data":"023eb4b942e99478ddc5c2302dbb0ec5737ecdcfe04fb54667164182410590d3"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.673143 4883 generic.go:334] "Generic (PLEG): container finished" podID="bb1924b3-495e-4d89-a314-5cc86d567758" 
containerID="8a043f976dc0ec960bd4342407fe8ea99f6aca698feb5f8a45170e566caea1a7" exitCode=0 Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.673220 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-nf489" event={"ID":"bb1924b3-495e-4d89-a314-5cc86d567758","Type":"ContainerDied","Data":"8a043f976dc0ec960bd4342407fe8ea99f6aca698feb5f8a45170e566caea1a7"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.673251 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-nf489" event={"ID":"bb1924b3-495e-4d89-a314-5cc86d567758","Type":"ContainerStarted","Data":"6002e8450e8885431aac74b1842d182bc92a0a97fd9c5adce1cdbfe3e6e07596"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.685774 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerStarted","Data":"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.685802 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerStarted","Data":"05151d51868c64168fd42dc513563a4dce7dbaa3cd8c62f4564dbc730c66c6a3"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.689737 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerStarted","Data":"71b700d498775842c5c3b3c9b8a10f0292a828bd6a69eebba134bc667b2b5df6"} Mar 10 09:23:00 crc kubenswrapper[4883]: I0310 09:23:00.716663 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5fcc9bbb48-lf4jb" podStartSLOduration=20.716649092 podStartE2EDuration="20.716649092s" podCreationTimestamp="2026-03-10 09:22:40 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:00.692790061 +0000 UTC m=+1166.947687950" watchObservedRunningTime="2026-03-10 09:23:00.716649092 +0000 UTC m=+1166.971546980" Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.262084 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.262468 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.388197 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75c886f8b5-scjqt" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.395344 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.395439 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.712402 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerStarted","Data":"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560"} Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.714123 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-log" containerID="cri-o://2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" gracePeriod=30 Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.716684 4883 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-httpd" containerID="cri-o://14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" gracePeriod=30 Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.722170 4883 generic.go:334] "Generic (PLEG): container finished" podID="6d78560f-1b01-4ac1-9c36-109595422d78" containerID="655c18b53a1ce432bb921c39b01b2479f6ad37de70ab46a32b33f42c98fb125b" exitCode=0 Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.722253 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jt8bs" event={"ID":"6d78560f-1b01-4ac1-9c36-109595422d78","Type":"ContainerDied","Data":"655c18b53a1ce432bb921c39b01b2479f6ad37de70ab46a32b33f42c98fb125b"} Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.724841 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerStarted","Data":"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731"} Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.733889 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerStarted","Data":"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4"} Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.735303 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.739809 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=24.73979783 podStartE2EDuration="24.73979783s" podCreationTimestamp="2026-03-10 09:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:01.73260713 +0000 UTC m=+1167.987505020" watchObservedRunningTime="2026-03-10 09:23:01.73979783 +0000 UTC m=+1167.994695718" Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.749529 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerStarted","Data":"859a5bbeb1589e12771278a11c1722a6122ae18255ec7752da9e54dc359a7b36"} Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.761566 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-nf489" event={"ID":"bb1924b3-495e-4d89-a314-5cc86d567758","Type":"ContainerStarted","Data":"abd0d4cdc5283d36393d37385d0b14c08a59b31c42729a63940bf76c4dcfaa0e"} Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.761603 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.773054 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f6f8846bd-rdwfd" podStartSLOduration=7.773044232 podStartE2EDuration="7.773044232s" podCreationTimestamp="2026-03-10 09:22:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:01.76760188 +0000 UTC m=+1168.022499769" watchObservedRunningTime="2026-03-10 09:23:01.773044232 +0000 UTC m=+1168.027942121" Mar 10 09:23:01 crc kubenswrapper[4883]: I0310 09:23:01.784741 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7859c7799c-nf489" podStartSLOduration=7.784721963 podStartE2EDuration="7.784721963s" podCreationTimestamp="2026-03-10 09:22:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 09:23:01.783961249 +0000 UTC m=+1168.038859138" watchObservedRunningTime="2026-03-10 09:23:01.784721963 +0000 UTC m=+1168.039619843" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.506432 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.683985 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.685547 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26kq2\" (UniqueName: \"kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.685861 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.685892 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686010 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686063 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686121 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686148 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"37cf08e9-e871-4798-8990-1c80f7776d8f\" (UID: \"37cf08e9-e871-4798-8990-1c80f7776d8f\") " Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686728 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs" (OuterVolumeSpecName: "logs") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.686836 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.690847 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.699727 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2" (OuterVolumeSpecName: "kube-api-access-26kq2") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "kube-api-access-26kq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.708617 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts" (OuterVolumeSpecName: "scripts") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.709279 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-798c4d5785-ftwkg" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.730224 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.756604 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data" (OuterVolumeSpecName: "config-data") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.772148 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "37cf08e9-e871-4798-8990-1c80f7776d8f" (UID: "37cf08e9-e871-4798-8990-1c80f7776d8f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787552 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787574 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787593 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787603 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37cf08e9-e871-4798-8990-1c80f7776d8f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc 
kubenswrapper[4883]: I0310 09:23:02.787611 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787628 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787636 4883 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37cf08e9-e871-4798-8990-1c80f7776d8f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.787644 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26kq2\" (UniqueName: \"kubernetes.io/projected/37cf08e9-e871-4798-8990-1c80f7776d8f-kube-api-access-26kq2\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.790898 4883 generic.go:334] "Generic (PLEG): container finished" podID="5acdd73a-9879-4507-8f6d-10e2ad8065e4" containerID="607df7cc400848b9303988905dd099e3a6ed13423fde585243a8fdb269315664" exitCode=0 Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.790976 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7f74" event={"ID":"5acdd73a-9879-4507-8f6d-10e2ad8065e4","Type":"ContainerDied","Data":"607df7cc400848b9303988905dd099e3a6ed13423fde585243a8fdb269315664"} Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794463 4883 generic.go:334] "Generic (PLEG): container finished" podID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerID="14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" exitCode=0 Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794514 4883 generic.go:334] "Generic (PLEG): container finished" 
podID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerID="2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" exitCode=143 Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794566 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerDied","Data":"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560"} Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794608 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerDied","Data":"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270"} Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794621 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"37cf08e9-e871-4798-8990-1c80f7776d8f","Type":"ContainerDied","Data":"05151d51868c64168fd42dc513563a4dce7dbaa3cd8c62f4564dbc730c66c6a3"} Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794641 4883 scope.go:117] "RemoveContainer" containerID="14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.794776 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.812914 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.815717 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerStarted","Data":"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032"} Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.840179 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.840167193 podStartE2EDuration="9.840167193s" podCreationTimestamp="2026-03-10 09:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:02.838798972 +0000 UTC m=+1169.093696872" watchObservedRunningTime="2026-03-10 09:23:02.840167193 +0000 UTC m=+1169.095065082" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.857564 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.880523 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887176 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:02 crc kubenswrapper[4883]: E0310 09:23:02.887599 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-log" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887618 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-log" Mar 10 09:23:02 crc kubenswrapper[4883]: E0310 09:23:02.887654 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="init" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887662 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="init" Mar 10 09:23:02 crc kubenswrapper[4883]: E0310 09:23:02.887685 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-httpd" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887692 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-httpd" Mar 10 09:23:02 crc kubenswrapper[4883]: E0310 09:23:02.887708 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887713 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887880 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-log" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887913 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="631a0fc2-de6d-4778-bce2-46b69c306e44" containerName="dnsmasq-dns" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.887931 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" containerName="glance-httpd" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.890159 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.890243 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.892184 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.913425 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.913718 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.992809 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993167 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993382 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc 
kubenswrapper[4883]: I0310 09:23:02.993506 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993571 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993662 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8mhb\" (UniqueName: \"kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993690 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:02 crc kubenswrapper[4883]: I0310 09:23:02.993719 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " 
pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.056174 4883 scope.go:117] "RemoveContainer" containerID="2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095068 4883 scope.go:117] "RemoveContainer" containerID="14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095680 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: E0310 09:23:03.095741 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560\": container with ID starting with 14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560 not found: ID does not exist" containerID="14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095781 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095859 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" 
Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095904 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095909 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095946 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095986 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8mhb\" (UniqueName: \"kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.096004 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 
09:23:03.096029 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.095769 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560"} err="failed to get container status \"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560\": rpc error: code = NotFound desc = could not find container \"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560\": container with ID starting with 14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560 not found: ID does not exist" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.099785 4883 scope.go:117] "RemoveContainer" containerID="2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.100485 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.100753 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: E0310 09:23:03.101741 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270\": container with ID starting with 2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270 not found: ID does not exist" containerID="2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.101774 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270"} err="failed to get container status \"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270\": rpc error: code = NotFound desc = could not find container \"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270\": container with ID starting with 2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270 not found: ID does not exist" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.101794 4883 scope.go:117] "RemoveContainer" containerID="14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.102665 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.104153 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.104601 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560"} err="failed to get container status \"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560\": rpc error: code = NotFound desc = could not find container \"14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560\": container with ID starting with 14b487252b0cdb271eb0be4704e237a9c0377c7e6439e24871aa1998e4698560 not found: ID does not exist" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.104747 4883 scope.go:117] "RemoveContainer" containerID="2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.105208 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270"} err="failed to get container status \"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270\": rpc error: code = NotFound desc = could not find container \"2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270\": container with ID starting with 2e382a16fe9ca3084e3a3372387a3ac5c073b8fd3ad6810d63737e53feb9f270 not found: ID does not exist" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.106639 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.114376 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc 
kubenswrapper[4883]: I0310 09:23:03.116227 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8mhb\" (UniqueName: \"kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.131501 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.220389 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.222635 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.302022 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data\") pod \"6d78560f-1b01-4ac1-9c36-109595422d78\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.302134 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psns8\" (UniqueName: \"kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8\") pod \"6d78560f-1b01-4ac1-9c36-109595422d78\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.302536 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle\") pod \"6d78560f-1b01-4ac1-9c36-109595422d78\" (UID: \"6d78560f-1b01-4ac1-9c36-109595422d78\") " Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.308614 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6d78560f-1b01-4ac1-9c36-109595422d78" (UID: "6d78560f-1b01-4ac1-9c36-109595422d78"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.313989 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8" (OuterVolumeSpecName: "kube-api-access-psns8") pod "6d78560f-1b01-4ac1-9c36-109595422d78" (UID: "6d78560f-1b01-4ac1-9c36-109595422d78"). 
InnerVolumeSpecName "kube-api-access-psns8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.335855 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d78560f-1b01-4ac1-9c36-109595422d78" (UID: "6d78560f-1b01-4ac1-9c36-109595422d78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.405221 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.405245 4883 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d78560f-1b01-4ac1-9c36-109595422d78-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.405255 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psns8\" (UniqueName: \"kubernetes.io/projected/6d78560f-1b01-4ac1-9c36-109595422d78-kube-api-access-psns8\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.740940 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.828754 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jt8bs" event={"ID":"6d78560f-1b01-4ac1-9c36-109595422d78","Type":"ContainerDied","Data":"1ecd75dacc64ce6638ff3cb4e1b982b79b4377e09b6132685f58b6c4877440c0"} Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.828800 4883 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="1ecd75dacc64ce6638ff3cb4e1b982b79b4377e09b6132685f58b6c4877440c0" Mar 10 09:23:03 crc kubenswrapper[4883]: I0310 09:23:03.828905 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jt8bs" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.049834 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-76f6589d69-9q47v"] Mar 10 09:23:04 crc kubenswrapper[4883]: E0310 09:23:04.050355 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d78560f-1b01-4ac1-9c36-109595422d78" containerName="barbican-db-sync" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.050372 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d78560f-1b01-4ac1-9c36-109595422d78" containerName="barbican-db-sync" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.050624 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d78560f-1b01-4ac1-9c36-109595422d78" containerName="barbican-db-sync" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.051643 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.061317 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.061714 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.061819 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q2mjf" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.065771 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.067581 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.072992 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.146901 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.149411 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 
09:23:04.149579 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.149674 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.149926 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjg75\" (UniqueName: \"kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.194918 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37cf08e9-e871-4798-8990-1c80f7776d8f" path="/var/lib/kubelet/pods/37cf08e9-e871-4798-8990-1c80f7776d8f/volumes" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.195818 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.195848 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.195862 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-worker-76f6589d69-9q47v"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.195879 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.227062 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.246886 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.253054 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjg75\" (UniqueName: \"kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.253495 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.253543 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.253728 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.254138 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.254376 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.254782 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.255246 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.255277 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbzvt\" (UniqueName: \"kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.255507 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.258642 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.264266 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.264691 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 
09:23:04.269526 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.269827 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7859c7799c-nf489" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="dnsmasq-dns" containerID="cri-o://abd0d4cdc5283d36393d37385d0b14c08a59b31c42729a63940bf76c4dcfaa0e" gracePeriod=10 Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.278018 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.279284 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjg75\" (UniqueName: \"kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75\") pod \"barbican-worker-76f6589d69-9q47v\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.292516 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.294313 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.300921 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.306396 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.307730 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.310315 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.314965 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.358065 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbzvt\" (UniqueName: \"kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.358391 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.358545 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.358599 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.358681 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.359500 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.364830 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.366328 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.366985 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.379158 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbzvt\" (UniqueName: \"kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt\") pod \"barbican-keystone-listener-5d58859d7d-76mv6\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.429316 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.460686 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.460741 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.460768 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.460953 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461022 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4rhj\" (UniqueName: \"kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj\") pod 
\"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461042 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461143 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461168 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461196 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461251 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.461277 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmx5g\" (UniqueName: \"kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.467431 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.562887 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.562973 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4rhj\" (UniqueName: \"kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.562994 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " 
pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563070 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563094 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563122 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563166 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563190 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmx5g\" (UniqueName: \"kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 
09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563303 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563330 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.563348 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.564260 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.564334 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.564335 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.564619 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.565040 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.565243 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.567583 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.570631 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.571287 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.579676 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4rhj\" (UniqueName: \"kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj\") pod \"barbican-api-857bd77984-4wnsb\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.579817 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmx5g\" (UniqueName: \"kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g\") pod \"dnsmasq-dns-8449d68f4f-tgr6b\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") " pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.749736 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.756353 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.838686 4883 generic.go:334] "Generic (PLEG): container finished" podID="bb1924b3-495e-4d89-a314-5cc86d567758" containerID="abd0d4cdc5283d36393d37385d0b14c08a59b31c42729a63940bf76c4dcfaa0e" exitCode=0 Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.839449 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-nf489" event={"ID":"bb1924b3-495e-4d89-a314-5cc86d567758","Type":"ContainerDied","Data":"abd0d4cdc5283d36393d37385d0b14c08a59b31c42729a63940bf76c4dcfaa0e"} Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.839538 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:04 crc kubenswrapper[4883]: I0310 09:23:04.839553 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.545827 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"] Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.549856 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.555874 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.556160 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.563354 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"] Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728511 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728575 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728674 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728695 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728726 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728832 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.728854 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gb9m\" (UniqueName: \"kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.831985 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832294 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832374 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832539 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832632 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gb9m\" (UniqueName: \"kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832742 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.832814 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.833410 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.839826 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.840718 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.840897 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.841400 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") pod 
\"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.842130 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.853145 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gb9m\" (UniqueName: \"kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m\") pod \"barbican-api-b5b6f75db-kgz55\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:06 crc kubenswrapper[4883]: I0310 09:23:06.874993 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:07 crc kubenswrapper[4883]: I0310 09:23:07.878413 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerStarted","Data":"4d280b27ae76beef2731a3863818dd720d9ca5f105e0f710f7b3f7d025052c9f"} Mar 10 09:23:07 crc kubenswrapper[4883]: I0310 09:23:07.880493 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7f74" event={"ID":"5acdd73a-9879-4507-8f6d-10e2ad8065e4","Type":"ContainerDied","Data":"0d148133502c63f130313a1a9a41570703ee3c780cfc640a21d5ed94368c1fc2"} Mar 10 09:23:07 crc kubenswrapper[4883]: I0310 09:23:07.880730 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d148133502c63f130313a1a9a41570703ee3c780cfc640a21d5ed94368c1fc2" Mar 10 09:23:07 crc kubenswrapper[4883]: I0310 09:23:07.883617 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-nf489" event={"ID":"bb1924b3-495e-4d89-a314-5cc86d567758","Type":"ContainerDied","Data":"6002e8450e8885431aac74b1842d182bc92a0a97fd9c5adce1cdbfe3e6e07596"} Mar 10 09:23:07 crc kubenswrapper[4883]: I0310 09:23:07.883641 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6002e8450e8885431aac74b1842d182bc92a0a97fd9c5adce1cdbfe3e6e07596" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.149054 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.162693 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274322 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274437 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274497 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274515 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mmp6\" (UniqueName: \"kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274540 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274580 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274608 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274681 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.274696 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27k8t\" (UniqueName: \"kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.275095 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.275147 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc\") pod \"bb1924b3-495e-4d89-a314-5cc86d567758\" (UID: \"bb1924b3-495e-4d89-a314-5cc86d567758\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 
09:23:08.275228 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data\") pod \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\" (UID: \"5acdd73a-9879-4507-8f6d-10e2ad8065e4\") " Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.283833 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6" (OuterVolumeSpecName: "kube-api-access-7mmp6") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "kube-api-access-7mmp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.289115 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.292956 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts" (OuterVolumeSpecName: "scripts") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.297677 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.304394 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t" (OuterVolumeSpecName: "kube-api-access-27k8t") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "kube-api-access-27k8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.335337 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.371990 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380597 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380623 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mmp6\" (UniqueName: \"kubernetes.io/projected/5acdd73a-9879-4507-8f6d-10e2ad8065e4-kube-api-access-7mmp6\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380635 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380644 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380654 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380663 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27k8t\" (UniqueName: \"kubernetes.io/projected/bb1924b3-495e-4d89-a314-5cc86d567758-kube-api-access-27k8t\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.380677 4883 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 
09:23:08.380686 4883 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.383049 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config" (OuterVolumeSpecName: "config") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.408885 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb1924b3-495e-4d89-a314-5cc86d567758" (UID: "bb1924b3-495e-4d89-a314-5cc86d567758"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.411303 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data" (OuterVolumeSpecName: "config-data") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.412854 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5acdd73a-9879-4507-8f6d-10e2ad8065e4" (UID: "5acdd73a-9879-4507-8f6d-10e2ad8065e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.462433 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"] Mar 10 09:23:08 crc kubenswrapper[4883]: W0310 09:23:08.466795 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce11c091_db9b_47fb_8427_dcaa2585a4c7.slice/crio-f2db0f853d6e758fe5cee704d792086fd294dc16742d1ad01bde933a91c3bcaa WatchSource:0}: Error finding container f2db0f853d6e758fe5cee704d792086fd294dc16742d1ad01bde933a91c3bcaa: Status 404 returned error can't find the container with id f2db0f853d6e758fe5cee704d792086fd294dc16742d1ad01bde933a91c3bcaa Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.483981 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.484009 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.484019 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb1924b3-495e-4d89-a314-5cc86d567758-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.484028 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acdd73a-9879-4507-8f6d-10e2ad8065e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.554623 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:08 crc 
kubenswrapper[4883]: I0310 09:23:08.565284 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"] Mar 10 09:23:08 crc kubenswrapper[4883]: W0310 09:23:08.581434 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod627e20df_c100_4dac_b344_efda8eda195a.slice/crio-c8cc80979c9fb422fb6df0d72e23560faaf12e9f1fbf0fcaea70b7f181ea66ed WatchSource:0}: Error finding container c8cc80979c9fb422fb6df0d72e23560faaf12e9f1fbf0fcaea70b7f181ea66ed: Status 404 returned error can't find the container with id c8cc80979c9fb422fb6df0d72e23560faaf12e9f1fbf0fcaea70b7f181ea66ed Mar 10 09:23:08 crc kubenswrapper[4883]: W0310 09:23:08.582313 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ac116f_f773_4b3f_a508_bc304668da18.slice/crio-867dc81634d3cab6b7d62fe7b7bea4736f522c76e9c10f3a681cf74c0480a8db WatchSource:0}: Error finding container 867dc81634d3cab6b7d62fe7b7bea4736f522c76e9c10f3a681cf74c0480a8db: Status 404 returned error can't find the container with id 867dc81634d3cab6b7d62fe7b7bea4736f522c76e9c10f3a681cf74c0480a8db Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.691360 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"] Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.714269 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76f6589d69-9q47v"] Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.843081 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.869746 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 
09:23:08.975420 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerStarted","Data":"840eb0e983eac3e58e23b21ee91762a03097be568c06b5951bfe32f90ffa8f08"} Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.992620 4883 generic.go:334] "Generic (PLEG): container finished" podID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerID="611c5c4793d5a469fa00a11e611df52aec3fea84115f5f16327469e87284b34b" exitCode=0 Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.992684 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" event={"ID":"ce11c091-db9b-47fb-8427-dcaa2585a4c7","Type":"ContainerDied","Data":"611c5c4793d5a469fa00a11e611df52aec3fea84115f5f16327469e87284b34b"} Mar 10 09:23:08 crc kubenswrapper[4883]: I0310 09:23:08.992705 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" event={"ID":"ce11c091-db9b-47fb-8427-dcaa2585a4c7","Type":"ContainerStarted","Data":"f2db0f853d6e758fe5cee704d792086fd294dc16742d1ad01bde933a91c3bcaa"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.008618 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerStarted","Data":"e6c22b5b53507d702e074cf63768ed9e237d5a5c16f54e7d10767483ac8e989f"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.023707 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerStarted","Data":"9942d2bf12f93ebc9815c517a342ac88745bd4ad2c848364cca8746de3f97630"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.031635 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" 
event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerStarted","Data":"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.031694 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerStarted","Data":"c8cc80979c9fb422fb6df0d72e23560faaf12e9f1fbf0fcaea70b7f181ea66ed"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.045824 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerStarted","Data":"68f801eebbf66b42a914bc4f687373317ee14f9ee6dba99ea4dc4fb9a1f0310b"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.075711 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n7f74" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.077755 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerStarted","Data":"f44137001e9de9273d72347a19e670685324d63a7ae12a0a99d5b0aa11da1c3a"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.077793 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerStarted","Data":"867dc81634d3cab6b7d62fe7b7bea4736f522c76e9c10f3a681cf74c0480a8db"} Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.078837 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-nf489" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.194408 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"] Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.206681 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-nf489"] Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.246247 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-744f4576f6-kglt9"] Mar 10 09:23:09 crc kubenswrapper[4883]: E0310 09:23:09.246783 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5acdd73a-9879-4507-8f6d-10e2ad8065e4" containerName="keystone-bootstrap" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.246806 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5acdd73a-9879-4507-8f6d-10e2ad8065e4" containerName="keystone-bootstrap" Mar 10 09:23:09 crc kubenswrapper[4883]: E0310 09:23:09.246820 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="dnsmasq-dns" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.246827 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="dnsmasq-dns" Mar 10 09:23:09 crc kubenswrapper[4883]: E0310 09:23:09.246838 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="init" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.246844 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="init" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.247087 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5acdd73a-9879-4507-8f6d-10e2ad8065e4" containerName="keystone-bootstrap" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.247111 4883 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" containerName="dnsmasq-dns" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.247847 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.255868 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-744f4576f6-kglt9"] Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258151 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258251 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258342 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258531 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258771 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-92kmh" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.258893 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405149 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-scripts\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405189 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-config-data\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405324 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-combined-ca-bundle\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405364 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-internal-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405456 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-public-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405676 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-fernet-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405779 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-credential-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.405886 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnv92\" (UniqueName: \"kubernetes.io/projected/c6effa97-6f88-4706-98bc-b51af01bd993-kube-api-access-mnv92\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507184 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-scripts\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507236 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-config-data\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507292 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-combined-ca-bundle\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507316 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-internal-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507348 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-public-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507403 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-fernet-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507444 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-credential-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.507507 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnv92\" (UniqueName: \"kubernetes.io/projected/c6effa97-6f88-4706-98bc-b51af01bd993-kube-api-access-mnv92\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.516790 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-scripts\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.516997 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-config-data\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.520826 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-combined-ca-bundle\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.523898 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-credential-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.525669 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-public-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.526959 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-fernet-keys\") pod \"keystone-744f4576f6-kglt9\" (UID: 
\"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.531020 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnv92\" (UniqueName: \"kubernetes.io/projected/c6effa97-6f88-4706-98bc-b51af01bd993-kube-api-access-mnv92\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.532868 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6effa97-6f88-4706-98bc-b51af01bd993-internal-tls-certs\") pod \"keystone-744f4576f6-kglt9\" (UID: \"c6effa97-6f88-4706-98bc-b51af01bd993\") " pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:09 crc kubenswrapper[4883]: I0310 09:23:09.576598 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.033175 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-744f4576f6-kglt9"] Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.092998 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb1924b3-495e-4d89-a314-5cc86d567758" path="/var/lib/kubelet/pods/bb1924b3-495e-4d89-a314-5cc86d567758/volumes" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.096931 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerStarted","Data":"7bfade1d342fc1fec145c040e3bb1b0fda95a95f92ce0479cba528ed21f45abc"} Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.098145 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 
09:23:10.098181 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.101415 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" event={"ID":"ce11c091-db9b-47fb-8427-dcaa2585a4c7","Type":"ContainerStarted","Data":"25d1184c834395ecc1ce637374da9092da7e27ecf8a8400d81776463506396e2"} Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.101924 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.112848 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerStarted","Data":"a4d2619e8455e83dda94f13bca3f4af0a00dbc8f602e8d4ef336ec6c1e921f82"} Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.121831 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerStarted","Data":"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c"} Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.122379 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.122415 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.129103 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-b5b6f75db-kgz55" podStartSLOduration=4.129081758 podStartE2EDuration="4.129081758s" podCreationTimestamp="2026-03-10 09:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:10.120226539 +0000 UTC m=+1176.375124427" watchObservedRunningTime="2026-03-10 09:23:10.129081758 +0000 UTC m=+1176.383979646" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.147135 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" podStartSLOduration=6.147112627 podStartE2EDuration="6.147112627s" podCreationTimestamp="2026-03-10 09:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:10.139415413 +0000 UTC m=+1176.394313302" watchObservedRunningTime="2026-03-10 09:23:10.147112627 +0000 UTC m=+1176.402010516" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.178952 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.178923483 podStartE2EDuration="8.178923483s" podCreationTimestamp="2026-03-10 09:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:10.165752426 +0000 UTC m=+1176.420650315" watchObservedRunningTime="2026-03-10 09:23:10.178923483 +0000 UTC m=+1176.433821372" Mar 10 09:23:10 crc kubenswrapper[4883]: I0310 09:23:10.202395 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-857bd77984-4wnsb" podStartSLOduration=6.202370357 podStartE2EDuration="6.202370357s" podCreationTimestamp="2026-03-10 09:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:10.190899275 +0000 UTC m=+1176.445797155" watchObservedRunningTime="2026-03-10 09:23:10.202370357 +0000 UTC m=+1176.457268245" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.148049 4883 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerStarted","Data":"8695ad4b4edf6d710e9f86e52079c0ebcb1312875a2b7956f23e1e79e035e665"} Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.148370 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerStarted","Data":"b1ec206adf1217262c3bfee94f02b30c425dada138bf01911ffc99b632986ffe"} Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.152494 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerStarted","Data":"f3498d2162eafd25a3496fdf4e5a77ae8f5d492b8653875841a51cf1fa9f94b7"} Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.154930 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-744f4576f6-kglt9" event={"ID":"c6effa97-6f88-4706-98bc-b51af01bd993","Type":"ContainerStarted","Data":"6139fe2e637ef029d92f45bf936438fe03f3dfbd08565e1cb11592cee96f67f5"} Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.154959 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-744f4576f6-kglt9" event={"ID":"c6effa97-6f88-4706-98bc-b51af01bd993","Type":"ContainerStarted","Data":"66165c2a4a853e9120bb23d143e2a8d0459c59cf8b3c081ac55da2c5c1ff9daf"} Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.173631 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-76f6589d69-9q47v" podStartSLOduration=6.473579475 podStartE2EDuration="8.173611013s" podCreationTimestamp="2026-03-10 09:23:03 +0000 UTC" firstStartedPulling="2026-03-10 09:23:08.731346506 +0000 UTC m=+1174.986244384" lastFinishedPulling="2026-03-10 09:23:10.431378034 +0000 UTC m=+1176.686275922" observedRunningTime="2026-03-10 
09:23:11.166838562 +0000 UTC m=+1177.421736451" watchObservedRunningTime="2026-03-10 09:23:11.173611013 +0000 UTC m=+1177.428508901" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.196405 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-744f4576f6-kglt9" podStartSLOduration=2.196391429 podStartE2EDuration="2.196391429s" podCreationTimestamp="2026-03-10 09:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:11.18875565 +0000 UTC m=+1177.443653539" watchObservedRunningTime="2026-03-10 09:23:11.196391429 +0000 UTC m=+1177.451289318" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.279553 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.312113 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5d48955d69-pvbn8"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.313685 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.329865 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.331040 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.349947 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data-custom\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.350033 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221490bc-406a-436f-8705-66106ed6bbe0-logs\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.350071 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-combined-ca-bundle\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.350092 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb2vv\" (UniqueName: \"kubernetes.io/projected/221490bc-406a-436f-8705-66106ed6bbe0-kube-api-access-lb2vv\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.350131 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.356648 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d48955d69-pvbn8"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.369152 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.399192 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-555c96ddb-t7tcm" podUID="ef0598ad-c7ea-4645-b553-7d9028397156" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.419873 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.452197 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data-custom\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.452352 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data-custom\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc 
kubenswrapper[4883]: I0310 09:23:11.452501 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.452730 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqck4\" (UniqueName: \"kubernetes.io/projected/dce7df3b-5f31-4732-8d27-8e06dc07824d-kube-api-access-nqck4\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.452852 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221490bc-406a-436f-8705-66106ed6bbe0-logs\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.452971 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dce7df3b-5f31-4732-8d27-8e06dc07824d-logs\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.453067 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-combined-ca-bundle\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: 
\"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.453172 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb2vv\" (UniqueName: \"kubernetes.io/projected/221490bc-406a-436f-8705-66106ed6bbe0-kube-api-access-lb2vv\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.453275 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-combined-ca-bundle\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.453218 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221490bc-406a-436f-8705-66106ed6bbe0-logs\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.453464 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.458225 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b55764b68-l794s"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.462922 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data-custom\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.468501 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-config-data\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.472917 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221490bc-406a-436f-8705-66106ed6bbe0-combined-ca-bundle\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.475047 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.476105 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb2vv\" (UniqueName: \"kubernetes.io/projected/221490bc-406a-436f-8705-66106ed6bbe0-kube-api-access-lb2vv\") pod \"barbican-worker-5d48955d69-pvbn8\" (UID: \"221490bc-406a-436f-8705-66106ed6bbe0\") " pod="openstack/barbican-worker-5d48955d69-pvbn8" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.486998 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b55764b68-l794s"] Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555234 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqck4\" (UniqueName: \"kubernetes.io/projected/dce7df3b-5f31-4732-8d27-8e06dc07824d-kube-api-access-nqck4\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555283 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-combined-ca-bundle\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555318 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-internal-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555350 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dce7df3b-5f31-4732-8d27-8e06dc07824d-logs\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555393 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-combined-ca-bundle\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555488 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd5mp\" (UniqueName: \"kubernetes.io/projected/17a46674-c6ec-4128-8285-d71c228d11c8-kube-api-access-sd5mp\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555532 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data-custom\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555703 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-public-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 
09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555796 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555840 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data-custom\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.555875 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.556009 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17a46674-c6ec-4128-8285-d71c228d11c8-logs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.556203 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dce7df3b-5f31-4732-8d27-8e06dc07824d-logs\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.559961 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-combined-ca-bundle\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.561073 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data-custom\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.566487 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dce7df3b-5f31-4732-8d27-8e06dc07824d-config-data\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.575902 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqck4\" (UniqueName: \"kubernetes.io/projected/dce7df3b-5f31-4732-8d27-8e06dc07824d-kube-api-access-nqck4\") pod \"barbican-keystone-listener-7f87cd7fb6-jz6ch\" (UID: \"dce7df3b-5f31-4732-8d27-8e06dc07824d\") " pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.650645 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d48955d69-pvbn8"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658432 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658483 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data-custom\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658602 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17a46674-c6ec-4128-8285-d71c228d11c8-logs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-combined-ca-bundle\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658731 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-internal-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658881 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd5mp\" (UniqueName: \"kubernetes.io/projected/17a46674-c6ec-4128-8285-d71c228d11c8-kube-api-access-sd5mp\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658951 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-public-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.658990 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17a46674-c6ec-4128-8285-d71c228d11c8-logs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.663056 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.663194 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data-custom\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.664993 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-public-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.665355 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-internal-tls-certs\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.666724 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-config-data\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.668201 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a46674-c6ec-4128-8285-d71c228d11c8-combined-ca-bundle\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.672439 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd5mp\" (UniqueName: \"kubernetes.io/projected/17a46674-c6ec-4128-8285-d71c228d11c8-kube-api-access-sd5mp\") pod \"barbican-api-7b55764b68-l794s\" (UID: \"17a46674-c6ec-4128-8285-d71c228d11c8\") " pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:11 crc kubenswrapper[4883]: I0310 09:23:11.803958 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.179350 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d48955d69-pvbn8"]
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.194458 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d48955d69-pvbn8" event={"ID":"221490bc-406a-436f-8705-66106ed6bbe0","Type":"ContainerStarted","Data":"b09a865ddd7a1b78c9c31ffc2f1b2e50541a572d4caa8b73de593c885e1724d8"}
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.201157 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerStarted","Data":"d20e2851ee81ff0d94c886dd8f844a7b533eac90fb46cb8579dbbc2d2478bfbb"}
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.221451 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v9wqz" event={"ID":"78bfcd03-74e4-4238-ae81-043bc04105cd","Type":"ContainerStarted","Data":"447fb8e21bbb8bc037b2913c14d7e94664156b41e117d80c9e5a7c41f589d745"}
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.221542 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch"]
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.223630 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-744f4576f6-kglt9"
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.227154 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" podStartSLOduration=6.080943312 podStartE2EDuration="8.227139247s" podCreationTimestamp="2026-03-10 09:23:04 +0000 UTC" firstStartedPulling="2026-03-10 09:23:08.731889469 +0000 UTC m=+1174.986787348" lastFinishedPulling="2026-03-10 09:23:10.878085404 +0000 UTC m=+1177.132983283" observedRunningTime="2026-03-10 09:23:12.22028404 +0000 UTC m=+1178.475181930" watchObservedRunningTime="2026-03-10 09:23:12.227139247 +0000 UTC m=+1178.482037137"
Mar 10 09:23:12 crc kubenswrapper[4883]: W0310 09:23:12.229534 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddce7df3b_5f31_4732_8d27_8e06dc07824d.slice/crio-09ad10ec780618f78419872c78b5eaf33bd53b32422fe584bbf68183e33f6331 WatchSource:0}: Error finding container 09ad10ec780618f78419872c78b5eaf33bd53b32422fe584bbf68183e33f6331: Status 404 returned error can't find the container with id 09ad10ec780618f78419872c78b5eaf33bd53b32422fe584bbf68183e33f6331
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.238334 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-v9wqz" podStartSLOduration=2.763724858 podStartE2EDuration="40.238317757s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="2026-03-10 09:22:34.092652535 +0000 UTC m=+1140.347550414" lastFinishedPulling="2026-03-10 09:23:11.567245424 +0000 UTC m=+1177.822143313" observedRunningTime="2026-03-10 09:23:12.23613474 +0000 UTC m=+1178.491032629" watchObservedRunningTime="2026-03-10 09:23:12.238317757 +0000 UTC m=+1178.493215645"
Mar 10 09:23:12 crc kubenswrapper[4883]: I0310 09:23:12.314244 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b55764b68-l794s"]
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.220781 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.221094 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.231317 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d48955d69-pvbn8" event={"ID":"221490bc-406a-436f-8705-66106ed6bbe0","Type":"ContainerStarted","Data":"830b49df936b02d19e6ae120338dd8e00727b41bd2ed719757de5b5c9c91bc0c"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.231371 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d48955d69-pvbn8" event={"ID":"221490bc-406a-436f-8705-66106ed6bbe0","Type":"ContainerStarted","Data":"293426d63ccc27304336f106c307c54dd630999057afb9451e63075fe697aa48"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.233898 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b55764b68-l794s" event={"ID":"17a46674-c6ec-4128-8285-d71c228d11c8","Type":"ContainerStarted","Data":"7d335d4784197fada2572dc8909148d75e141ba92964e9663bfde49a3713142d"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.234160 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b55764b68-l794s" event={"ID":"17a46674-c6ec-4128-8285-d71c228d11c8","Type":"ContainerStarted","Data":"18e716d7c9c21fbf104b169821c50a69b6dce5fd0637a0d22aa31622a442a36c"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.234194 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b55764b68-l794s" event={"ID":"17a46674-c6ec-4128-8285-d71c228d11c8","Type":"ContainerStarted","Data":"62af63d0f1e089100b9be851d041522e73589ef25a611d1597ee11eeca30449a"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.236188 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" event={"ID":"dce7df3b-5f31-4732-8d27-8e06dc07824d","Type":"ContainerStarted","Data":"11eba696e00d12ccbbb3d6ff0afa1f69ee6059e203e2827ba694d408742d8f2c"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.236215 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" event={"ID":"dce7df3b-5f31-4732-8d27-8e06dc07824d","Type":"ContainerStarted","Data":"1bb3b108ebf5b9acbfe8dce08e655bdccc54926cdeb51524044f4bcc719141e1"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.236245 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" event={"ID":"dce7df3b-5f31-4732-8d27-8e06dc07824d","Type":"ContainerStarted","Data":"09ad10ec780618f78419872c78b5eaf33bd53b32422fe584bbf68183e33f6331"}
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.236416 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-857bd77984-4wnsb" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" containerID="cri-o://4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1" gracePeriod=30
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.236597 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-857bd77984-4wnsb" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api" containerID="cri-o://7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c" gracePeriod=30
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.256496 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5d48955d69-pvbn8" podStartSLOduration=2.25510016 podStartE2EDuration="2.25510016s" podCreationTimestamp="2026-03-10 09:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:13.246161556 +0000 UTC m=+1179.501059444" watchObservedRunningTime="2026-03-10 09:23:13.25510016 +0000 UTC m=+1179.509998050"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.275573 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-857bd77984-4wnsb" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.288619 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b55764b68-l794s" podStartSLOduration=2.2886027860000002 podStartE2EDuration="2.288602786s" podCreationTimestamp="2026-03-10 09:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:13.264025551 +0000 UTC m=+1179.518923430" watchObservedRunningTime="2026-03-10 09:23:13.288602786 +0000 UTC m=+1179.543500674"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.320213 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-76f6589d69-9q47v"]
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.320437 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-76f6589d69-9q47v" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker-log" containerID="cri-o://b1ec206adf1217262c3bfee94f02b30c425dada138bf01911ffc99b632986ffe" gracePeriod=30
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.320872 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-76f6589d69-9q47v" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker" containerID="cri-o://8695ad4b4edf6d710e9f86e52079c0ebcb1312875a2b7956f23e1e79e035e665" gracePeriod=30
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.326955 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.332709 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.336313 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7f87cd7fb6-jz6ch" podStartSLOduration=2.336291101 podStartE2EDuration="2.336291101s" podCreationTimestamp="2026-03-10 09:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:13.291095737 +0000 UTC m=+1179.545993626" watchObservedRunningTime="2026-03-10 09:23:13.336291101 +0000 UTC m=+1179.591188991"
Mar 10 09:23:13 crc kubenswrapper[4883]: I0310 09:23:13.353198 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"]
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.266463 4883 generic.go:334] "Generic (PLEG): container finished" podID="78bfcd03-74e4-4238-ae81-043bc04105cd" containerID="447fb8e21bbb8bc037b2913c14d7e94664156b41e117d80c9e5a7c41f589d745" exitCode=0
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.266725 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v9wqz" event={"ID":"78bfcd03-74e4-4238-ae81-043bc04105cd","Type":"ContainerDied","Data":"447fb8e21bbb8bc037b2913c14d7e94664156b41e117d80c9e5a7c41f589d745"}
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.275134 4883 generic.go:334] "Generic (PLEG): container finished" podID="13211306-6813-40e0-91c2-5f8ac43968c6" containerID="8695ad4b4edf6d710e9f86e52079c0ebcb1312875a2b7956f23e1e79e035e665" exitCode=0
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.275176 4883 generic.go:334] "Generic (PLEG): container finished" podID="13211306-6813-40e0-91c2-5f8ac43968c6" containerID="b1ec206adf1217262c3bfee94f02b30c425dada138bf01911ffc99b632986ffe" exitCode=143
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.275194 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerDied","Data":"8695ad4b4edf6d710e9f86e52079c0ebcb1312875a2b7956f23e1e79e035e665"}
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.275244 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerDied","Data":"b1ec206adf1217262c3bfee94f02b30c425dada138bf01911ffc99b632986ffe"}
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.281732 4883 generic.go:334] "Generic (PLEG): container finished" podID="627e20df-c100-4dac-b344-efda8eda195a" containerID="4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1" exitCode=143
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.284440 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerDied","Data":"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1"}
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.286093 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener-log" containerID="cri-o://f3498d2162eafd25a3496fdf4e5a77ae8f5d492b8653875841a51cf1fa9f94b7" gracePeriod=30
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.286394 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener" containerID="cri-o://d20e2851ee81ff0d94c886dd8f844a7b533eac90fb46cb8579dbbc2d2478bfbb" gracePeriod=30
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.286527 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.291098 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.291117 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b55764b68-l794s"
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.291129 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.752697 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b"
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.831688 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"]
Mar 10 09:23:14 crc kubenswrapper[4883]: I0310 09:23:14.831955 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="dnsmasq-dns" containerID="cri-o://368d8e92de1f4ea2c761213a4a664cf7368f309a09685b643b23b733208da7ea" gracePeriod=10
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.297539 4883 generic.go:334] "Generic (PLEG): container finished" podID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerID="d20e2851ee81ff0d94c886dd8f844a7b533eac90fb46cb8579dbbc2d2478bfbb" exitCode=0
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.297965 4883 generic.go:334] "Generic (PLEG): container finished" podID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerID="f3498d2162eafd25a3496fdf4e5a77ae8f5d492b8653875841a51cf1fa9f94b7" exitCode=143
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.297742 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerDied","Data":"d20e2851ee81ff0d94c886dd8f844a7b533eac90fb46cb8579dbbc2d2478bfbb"}
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.298100 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerDied","Data":"f3498d2162eafd25a3496fdf4e5a77ae8f5d492b8653875841a51cf1fa9f94b7"}
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.301764 4883 generic.go:334] "Generic (PLEG): container finished" podID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerID="368d8e92de1f4ea2c761213a4a664cf7368f309a09685b643b23b733208da7ea" exitCode=0
Mar 10 09:23:15 crc kubenswrapper[4883]: I0310 09:23:15.303113 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" event={"ID":"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499","Type":"ContainerDied","Data":"368d8e92de1f4ea2c761213a4a664cf7368f309a09685b643b23b733208da7ea"}
Mar 10 09:23:16 crc kubenswrapper[4883]: I0310 09:23:16.219343 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:16 crc kubenswrapper[4883]: I0310 09:23:16.223554 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 09:23:17 crc kubenswrapper[4883]: I0310 09:23:17.449494 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:23:17 crc kubenswrapper[4883]: I0310 09:23:17.449831 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.106630 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b5b6f75db-kgz55"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.281088 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: connect: connection refused"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.415346 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b5b6f75db-kgz55"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.478271 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v9wqz"
Mar 10 09:23:18 crc kubenswrapper[4883]: E0310 09:23:18.634696 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.649572 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6975\" (UniqueName: \"kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975\") pod \"78bfcd03-74e4-4238-ae81-043bc04105cd\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.649688 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts\") pod \"78bfcd03-74e4-4238-ae81-043bc04105cd\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.651585 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data\") pod \"78bfcd03-74e4-4238-ae81-043bc04105cd\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.651721 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs\") pod \"78bfcd03-74e4-4238-ae81-043bc04105cd\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.651821 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle\") pod \"78bfcd03-74e4-4238-ae81-043bc04105cd\" (UID: \"78bfcd03-74e4-4238-ae81-043bc04105cd\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.652704 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs" (OuterVolumeSpecName: "logs") pod "78bfcd03-74e4-4238-ae81-043bc04105cd" (UID: "78bfcd03-74e4-4238-ae81-043bc04105cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.652979 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78bfcd03-74e4-4238-ae81-043bc04105cd-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.658523 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975" (OuterVolumeSpecName: "kube-api-access-x6975") pod "78bfcd03-74e4-4238-ae81-043bc04105cd" (UID: "78bfcd03-74e4-4238-ae81-043bc04105cd"). InnerVolumeSpecName "kube-api-access-x6975". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.659365 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts" (OuterVolumeSpecName: "scripts") pod "78bfcd03-74e4-4238-ae81-043bc04105cd" (UID: "78bfcd03-74e4-4238-ae81-043bc04105cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.673918 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-857bd77984-4wnsb" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:60948->10.217.0.161:9311: read: connection reset by peer"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.674200 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-857bd77984-4wnsb" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:60940->10.217.0.161:9311: read: connection reset by peer"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.684638 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data" (OuterVolumeSpecName: "config-data") pod "78bfcd03-74e4-4238-ae81-043bc04105cd" (UID: "78bfcd03-74e4-4238-ae81-043bc04105cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.756895 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6975\" (UniqueName: \"kubernetes.io/projected/78bfcd03-74e4-4238-ae81-043bc04105cd-kube-api-access-x6975\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.757146 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.757157 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.762673 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78bfcd03-74e4-4238-ae81-043bc04105cd" (UID: "78bfcd03-74e4-4238-ae81-043bc04105cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.768696 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.856414 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76f6589d69-9q47v"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.856847 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6"
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.878977 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.879147 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs\") pod \"13211306-6813-40e0-91c2-5f8ac43968c6\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.879230 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle\") pod \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.879261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj2sl\" (UniqueName: \"kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.879298 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjg75\" (UniqueName: \"kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75\") pod \"13211306-6813-40e0-91c2-5f8ac43968c6\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") "
Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.880133 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\"
(UniqueName: \"kubernetes.io/secret/78bfcd03-74e4-4238-ae81-043bc04105cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.881223 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs" (OuterVolumeSpecName: "logs") pod "13211306-6813-40e0-91c2-5f8ac43968c6" (UID: "13211306-6813-40e0-91c2-5f8ac43968c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.884723 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl" (OuterVolumeSpecName: "kube-api-access-zj2sl") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "kube-api-access-zj2sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.888681 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75" (OuterVolumeSpecName: "kube-api-access-jjg75") pod "13211306-6813-40e0-91c2-5f8ac43968c6" (UID: "13211306-6813-40e0-91c2-5f8ac43968c6"). InnerVolumeSpecName "kube-api-access-jjg75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.914154 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "329b0d09-6aa9-427f-8b4b-209bb2c6707b" (UID: "329b0d09-6aa9-427f-8b4b-209bb2c6707b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.956945 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.981342 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.981389 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data\") pod \"13211306-6813-40e0-91c2-5f8ac43968c6\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982225 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle\") pod \"13211306-6813-40e0-91c2-5f8ac43968c6\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982297 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data\") pod \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982315 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom\") pod \"13211306-6813-40e0-91c2-5f8ac43968c6\" (UID: \"13211306-6813-40e0-91c2-5f8ac43968c6\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982334 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs\") pod \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982362 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbzvt\" (UniqueName: \"kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt\") pod \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982383 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982399 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb\") pod \"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\" (UID: 
\"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982418 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom\") pod \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\" (UID: \"329b0d09-6aa9-427f-8b4b-209bb2c6707b\") " Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982878 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13211306-6813-40e0-91c2-5f8ac43968c6-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982891 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982903 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj2sl\" (UniqueName: \"kubernetes.io/projected/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-kube-api-access-zj2sl\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982913 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjg75\" (UniqueName: \"kubernetes.io/projected/13211306-6813-40e0-91c2-5f8ac43968c6-kube-api-access-jjg75\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.982922 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.983996 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs" (OuterVolumeSpecName: "logs") pod 
"329b0d09-6aa9-427f-8b4b-209bb2c6707b" (UID: "329b0d09-6aa9-427f-8b4b-209bb2c6707b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.985929 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13211306-6813-40e0-91c2-5f8ac43968c6" (UID: "13211306-6813-40e0-91c2-5f8ac43968c6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.995808 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "329b0d09-6aa9-427f-8b4b-209bb2c6707b" (UID: "329b0d09-6aa9-427f-8b4b-209bb2c6707b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:18 crc kubenswrapper[4883]: I0310 09:23:18.995889 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt" (OuterVolumeSpecName: "kube-api-access-nbzvt") pod "329b0d09-6aa9-427f-8b4b-209bb2c6707b" (UID: "329b0d09-6aa9-427f-8b4b-209bb2c6707b"). InnerVolumeSpecName "kube-api-access-nbzvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.084746 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13211306-6813-40e0-91c2-5f8ac43968c6" (UID: "13211306-6813-40e0-91c2-5f8ac43968c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.085279 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087290 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087322 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087332 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087344 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/329b0d09-6aa9-427f-8b4b-209bb2c6707b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087356 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbzvt\" (UniqueName: \"kubernetes.io/projected/329b0d09-6aa9-427f-8b4b-209bb2c6707b-kube-api-access-nbzvt\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.087365 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.109985 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.113836 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.117572 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data" (OuterVolumeSpecName: "config-data") pod "329b0d09-6aa9-427f-8b4b-209bb2c6707b" (UID: "329b0d09-6aa9-427f-8b4b-209bb2c6707b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.121257 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data" (OuterVolumeSpecName: "config-data") pod "13211306-6813-40e0-91c2-5f8ac43968c6" (UID: "13211306-6813-40e0-91c2-5f8ac43968c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.146418 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config" (OuterVolumeSpecName: "config") pod "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" (UID: "1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.190616 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.190645 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/329b0d09-6aa9-427f-8b4b-209bb2c6707b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.190659 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.190670 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.190678 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13211306-6813-40e0-91c2-5f8ac43968c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.199833 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.296092 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom\") pod \"627e20df-c100-4dac-b344-efda8eda195a\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.296161 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4rhj\" (UniqueName: \"kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj\") pod \"627e20df-c100-4dac-b344-efda8eda195a\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.296227 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle\") pod \"627e20df-c100-4dac-b344-efda8eda195a\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.296284 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data\") pod \"627e20df-c100-4dac-b344-efda8eda195a\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.301402 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "627e20df-c100-4dac-b344-efda8eda195a" (UID: "627e20df-c100-4dac-b344-efda8eda195a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.312612 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj" (OuterVolumeSpecName: "kube-api-access-h4rhj") pod "627e20df-c100-4dac-b344-efda8eda195a" (UID: "627e20df-c100-4dac-b344-efda8eda195a"). InnerVolumeSpecName "kube-api-access-h4rhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.336824 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "627e20df-c100-4dac-b344-efda8eda195a" (UID: "627e20df-c100-4dac-b344-efda8eda195a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.342688 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data" (OuterVolumeSpecName: "config-data") pod "627e20df-c100-4dac-b344-efda8eda195a" (UID: "627e20df-c100-4dac-b344-efda8eda195a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.377810 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v9wqz" event={"ID":"78bfcd03-74e4-4238-ae81-043bc04105cd","Type":"ContainerDied","Data":"a9fc6acc617749c8e1de867b10f66ead08446875d606f1370db6c642ea9067e0"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.377839 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-v9wqz" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.377860 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9fc6acc617749c8e1de867b10f66ead08446875d606f1370db6c642ea9067e0" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.379494 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x2hf5" event={"ID":"dc0b1d9d-7834-473a-a487-6f540c606706","Type":"ContainerStarted","Data":"0058e7bce226cca970138ded52e080821de3de3f9381dfb13e4eee1dea226cbf"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.382141 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76f6589d69-9q47v" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.382129 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76f6589d69-9q47v" event={"ID":"13211306-6813-40e0-91c2-5f8ac43968c6","Type":"ContainerDied","Data":"840eb0e983eac3e58e23b21ee91762a03097be568c06b5951bfe32f90ffa8f08"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.382278 4883 scope.go:117] "RemoveContainer" containerID="8695ad4b4edf6d710e9f86e52079c0ebcb1312875a2b7956f23e1e79e035e665" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.390960 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" event={"ID":"329b0d09-6aa9-427f-8b4b-209bb2c6707b","Type":"ContainerDied","Data":"e6c22b5b53507d702e074cf63768ed9e237d5a5c16f54e7d10767483ac8e989f"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.391091 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d58859d7d-76mv6" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.395781 4883 generic.go:334] "Generic (PLEG): container finished" podID="627e20df-c100-4dac-b344-efda8eda195a" containerID="7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c" exitCode=0 Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.395836 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerDied","Data":"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.395857 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-857bd77984-4wnsb" event={"ID":"627e20df-c100-4dac-b344-efda8eda195a","Type":"ContainerDied","Data":"c8cc80979c9fb422fb6df0d72e23560faaf12e9f1fbf0fcaea70b7f181ea66ed"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.395899 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-857bd77984-4wnsb" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.397771 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs\") pod \"627e20df-c100-4dac-b344-efda8eda195a\" (UID: \"627e20df-c100-4dac-b344-efda8eda195a\") " Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.398981 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4rhj\" (UniqueName: \"kubernetes.io/projected/627e20df-c100-4dac-b344-efda8eda195a-kube-api-access-h4rhj\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.399001 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.399011 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.399021 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/627e20df-c100-4dac-b344-efda8eda195a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.404759 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs" (OuterVolumeSpecName: "logs") pod "627e20df-c100-4dac-b344-efda8eda195a" (UID: "627e20df-c100-4dac-b344-efda8eda195a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.406291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" event={"ID":"1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499","Type":"ContainerDied","Data":"597e041b53a7dcfb1def658755f45ca307eb7a79b514a35cb1ad87244e150850"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.406425 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-6pthc" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.409062 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerStarted","Data":"82ac9cef449a166509d1caa350f74b5be107da82b10ce98d86e5e716e762ff21"} Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.409198 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="ceilometer-notification-agent" containerID="cri-o://859a5bbeb1589e12771278a11c1722a6122ae18255ec7752da9e54dc359a7b36" gracePeriod=30 Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.409414 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.409467 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="proxy-httpd" containerID="cri-o://82ac9cef449a166509d1caa350f74b5be107da82b10ce98d86e5e716e762ff21" gracePeriod=30 Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.409530 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="sg-core" 
containerID="cri-o://68f801eebbf66b42a914bc4f687373317ee14f9ee6dba99ea4dc4fb9a1f0310b" gracePeriod=30 Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.428963 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-x2hf5" podStartSLOduration=2.666803312 podStartE2EDuration="47.428934081s" podCreationTimestamp="2026-03-10 09:22:32 +0000 UTC" firstStartedPulling="2026-03-10 09:22:33.623041735 +0000 UTC m=+1139.877939624" lastFinishedPulling="2026-03-10 09:23:18.385172503 +0000 UTC m=+1184.640070393" observedRunningTime="2026-03-10 09:23:19.404133996 +0000 UTC m=+1185.659031876" watchObservedRunningTime="2026-03-10 09:23:19.428934081 +0000 UTC m=+1185.683831970" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.436972 4883 scope.go:117] "RemoveContainer" containerID="b1ec206adf1217262c3bfee94f02b30c425dada138bf01911ffc99b632986ffe" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.466529 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.466752 4883 scope.go:117] "RemoveContainer" containerID="d20e2851ee81ff0d94c886dd8f844a7b533eac90fb46cb8579dbbc2d2478bfbb" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.486092 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5d58859d7d-76mv6"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.490911 4883 scope.go:117] "RemoveContainer" containerID="f3498d2162eafd25a3496fdf4e5a77ae8f5d492b8653875841a51cf1fa9f94b7" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.495744 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-76f6589d69-9q47v"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.500611 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627e20df-c100-4dac-b344-efda8eda195a-logs\") 
on node \"crc\" DevicePath \"\"" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.501290 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-76f6589d69-9q47v"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.519352 4883 scope.go:117] "RemoveContainer" containerID="7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.544332 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.566242 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-6pthc"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603347 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5946656968-5mzlm"] Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603874 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603888 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker-log" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603907 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="dnsmasq-dns" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603913 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="dnsmasq-dns" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603922 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603929 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener-log" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603940 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603946 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603957 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603963 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603978 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.603984 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.603995 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="init" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604001 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="init" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.604013 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604019 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.604032 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" containerName="placement-db-sync" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604038 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" containerName="placement-db-sync" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604387 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604415 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" containerName="dnsmasq-dns" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604425 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604435 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604447 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" containerName="barbican-keystone-listener" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604465 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" containerName="barbican-worker" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604491 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" containerName="placement-db-sync" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.604500 4883 
memory_manager.go:354] "RemoveStaleState removing state" podUID="627e20df-c100-4dac-b344-efda8eda195a" containerName="barbican-api-log" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.607297 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.609850 4883 scope.go:117] "RemoveContainer" containerID="4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.613780 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.614779 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4s72j" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.616249 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.616443 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5946656968-5mzlm"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.622377 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.624960 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.680224 4883 scope.go:117] "RemoveContainer" containerID="7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.680777 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c\": container with ID starting with 
7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c not found: ID does not exist" containerID="7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.680818 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c"} err="failed to get container status \"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c\": rpc error: code = NotFound desc = could not find container \"7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c\": container with ID starting with 7c3787220fac228a721b078c0af78f8ccf55ccc30a5c48c03de9de7632f7104c not found: ID does not exist" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.680843 4883 scope.go:117] "RemoveContainer" containerID="4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1" Mar 10 09:23:19 crc kubenswrapper[4883]: E0310 09:23:19.681090 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1\": container with ID starting with 4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1 not found: ID does not exist" containerID="4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.681125 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1"} err="failed to get container status \"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1\": rpc error: code = NotFound desc = could not find container \"4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1\": container with ID starting with 4d94c80485b1e4298de0d40abeaf1c471bb788f2c486bfb51ccc5ea376df36c1 not found: ID does not 
exist" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.681146 4883 scope.go:117] "RemoveContainer" containerID="368d8e92de1f4ea2c761213a4a664cf7368f309a09685b643b23b733208da7ea" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.701334 4883 scope.go:117] "RemoveContainer" containerID="e9f3a5dd63ead621338a9597451694f38d0bb7781f2624ce640a6d0a6dc7e4a2" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712576 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-config-data\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712660 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-scripts\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712686 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-public-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712730 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-combined-ca-bundle\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc 
kubenswrapper[4883]: I0310 09:23:19.712847 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-internal-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712877 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309c3af5-db30-48b8-8118-471950b7312c-logs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.712926 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wczn\" (UniqueName: \"kubernetes.io/projected/309c3af5-db30-48b8-8118-471950b7312c-kube-api-access-4wczn\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.746181 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.753811 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-857bd77984-4wnsb"] Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815500 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-combined-ca-bundle\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815601 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-internal-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815641 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309c3af5-db30-48b8-8118-471950b7312c-logs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815703 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wczn\" (UniqueName: \"kubernetes.io/projected/309c3af5-db30-48b8-8118-471950b7312c-kube-api-access-4wczn\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815781 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-config-data\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815822 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-scripts\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.815845 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-public-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.816907 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/309c3af5-db30-48b8-8118-471950b7312c-logs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.824669 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-scripts\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.824758 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-config-data\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.825152 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-internal-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.827741 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-combined-ca-bundle\") pod \"placement-5946656968-5mzlm\" (UID: 
\"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.828999 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/309c3af5-db30-48b8-8118-471950b7312c-public-tls-certs\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.835374 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wczn\" (UniqueName: \"kubernetes.io/projected/309c3af5-db30-48b8-8118-471950b7312c-kube-api-access-4wczn\") pod \"placement-5946656968-5mzlm\" (UID: \"309c3af5-db30-48b8-8118-471950b7312c\") " pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:19 crc kubenswrapper[4883]: I0310 09:23:19.962910 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.095985 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13211306-6813-40e0-91c2-5f8ac43968c6" path="/var/lib/kubelet/pods/13211306-6813-40e0-91c2-5f8ac43968c6/volumes" Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.096921 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499" path="/var/lib/kubelet/pods/1cfa9d4a-7ad8-4e8f-8bb3-df5271a96499/volumes" Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.097813 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329b0d09-6aa9-427f-8b4b-209bb2c6707b" path="/var/lib/kubelet/pods/329b0d09-6aa9-427f-8b4b-209bb2c6707b/volumes" Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.099078 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627e20df-c100-4dac-b344-efda8eda195a" 
path="/var/lib/kubelet/pods/627e20df-c100-4dac-b344-efda8eda195a/volumes" Mar 10 09:23:20 crc kubenswrapper[4883]: W0310 09:23:20.391697 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod309c3af5_db30_48b8_8118_471950b7312c.slice/crio-b61d7b6065c243f6373050e2a8de83e69a1990dc6e4913c4f3260a337aeab001 WatchSource:0}: Error finding container b61d7b6065c243f6373050e2a8de83e69a1990dc6e4913c4f3260a337aeab001: Status 404 returned error can't find the container with id b61d7b6065c243f6373050e2a8de83e69a1990dc6e4913c4f3260a337aeab001 Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.393150 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5946656968-5mzlm"] Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.443414 4883 generic.go:334] "Generic (PLEG): container finished" podID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerID="82ac9cef449a166509d1caa350f74b5be107da82b10ce98d86e5e716e762ff21" exitCode=0 Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.443446 4883 generic.go:334] "Generic (PLEG): container finished" podID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerID="68f801eebbf66b42a914bc4f687373317ee14f9ee6dba99ea4dc4fb9a1f0310b" exitCode=2 Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.443519 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerDied","Data":"82ac9cef449a166509d1caa350f74b5be107da82b10ce98d86e5e716e762ff21"} Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.443570 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerDied","Data":"68f801eebbf66b42a914bc4f687373317ee14f9ee6dba99ea4dc4fb9a1f0310b"} Mar 10 09:23:20 crc kubenswrapper[4883]: I0310 09:23:20.444798 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-5946656968-5mzlm" event={"ID":"309c3af5-db30-48b8-8118-471950b7312c","Type":"ContainerStarted","Data":"b61d7b6065c243f6373050e2a8de83e69a1990dc6e4913c4f3260a337aeab001"} Mar 10 09:23:21 crc kubenswrapper[4883]: I0310 09:23:21.459463 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5946656968-5mzlm" event={"ID":"309c3af5-db30-48b8-8118-471950b7312c","Type":"ContainerStarted","Data":"e79b3bbae1ef491e3eb1cf5d5527c1a8f3f25ca7b001953b719564deb9bcc0f4"} Mar 10 09:23:21 crc kubenswrapper[4883]: I0310 09:23:21.459796 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5946656968-5mzlm" event={"ID":"309c3af5-db30-48b8-8118-471950b7312c","Type":"ContainerStarted","Data":"49ea9b54ba2a9c9f6832dee8075a00229502735ff96263c0039328cb99965844"} Mar 10 09:23:21 crc kubenswrapper[4883]: I0310 09:23:21.460085 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:21 crc kubenswrapper[4883]: I0310 09:23:21.460109 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:21 crc kubenswrapper[4883]: I0310 09:23:21.489243 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5946656968-5mzlm" podStartSLOduration=2.489224986 podStartE2EDuration="2.489224986s" podCreationTimestamp="2026-03-10 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:21.48016418 +0000 UTC m=+1187.735062069" watchObservedRunningTime="2026-03-10 09:23:21.489224986 +0000 UTC m=+1187.744122875" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.471189 4883 generic.go:334] "Generic (PLEG): container finished" podID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerID="859a5bbeb1589e12771278a11c1722a6122ae18255ec7752da9e54dc359a7b36" exitCode=0 Mar 
10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.471267 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerDied","Data":"859a5bbeb1589e12771278a11c1722a6122ae18255ec7752da9e54dc359a7b36"} Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.473359 4883 generic.go:334] "Generic (PLEG): container finished" podID="dc0b1d9d-7834-473a-a487-6f540c606706" containerID="0058e7bce226cca970138ded52e080821de3de3f9381dfb13e4eee1dea226cbf" exitCode=0 Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.473466 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x2hf5" event={"ID":"dc0b1d9d-7834-473a-a487-6f540c606706","Type":"ContainerDied","Data":"0058e7bce226cca970138ded52e080821de3de3f9381dfb13e4eee1dea226cbf"} Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.772087 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788469 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788577 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788630 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd\") pod 
\"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788701 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmmc9\" (UniqueName: \"kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788911 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.788975 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.789005 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd\") pod \"87232af6-dc87-4f68-8b1f-850fd98219a8\" (UID: \"87232af6-dc87-4f68-8b1f-850fd98219a8\") " Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.790229 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.794444 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.820637 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts" (OuterVolumeSpecName: "scripts") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.824222 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9" (OuterVolumeSpecName: "kube-api-access-wmmc9") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "kube-api-access-wmmc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.836828 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.849465 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.855739 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data" (OuterVolumeSpecName: "config-data") pod "87232af6-dc87-4f68-8b1f-850fd98219a8" (UID: "87232af6-dc87-4f68-8b1f-850fd98219a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.890933 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892300 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892330 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892341 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892351 4883 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892361 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87232af6-dc87-4f68-8b1f-850fd98219a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892369 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87232af6-dc87-4f68-8b1f-850fd98219a8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.892378 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmmc9\" (UniqueName: \"kubernetes.io/projected/87232af6-dc87-4f68-8b1f-850fd98219a8-kube-api-access-wmmc9\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:22 crc kubenswrapper[4883]: I0310 09:23:22.994936 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.099574 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.145889 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b55764b68-l794s" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.218121 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"] Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.218567 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-b5b6f75db-kgz55" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api-log" 
containerID="cri-o://f44137001e9de9273d72347a19e670685324d63a7ae12a0a99d5b0aa11da1c3a" gracePeriod=30 Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.218685 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-b5b6f75db-kgz55" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api" containerID="cri-o://7bfade1d342fc1fec145c040e3bb1b0fda95a95f92ce0479cba528ed21f45abc" gracePeriod=30 Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.486064 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87232af6-dc87-4f68-8b1f-850fd98219a8","Type":"ContainerDied","Data":"340a626f298a9ef4ea8b124b00075b5b0663514ede9873d187c8ba97a25a89bf"} Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.486135 4883 scope.go:117] "RemoveContainer" containerID="82ac9cef449a166509d1caa350f74b5be107da82b10ce98d86e5e716e762ff21" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.486129 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.489441 4883 generic.go:334] "Generic (PLEG): container finished" podID="41ac116f-f773-4b3f-a508-bc304668da18" containerID="f44137001e9de9273d72347a19e670685324d63a7ae12a0a99d5b0aa11da1c3a" exitCode=143 Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.490674 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerDied","Data":"f44137001e9de9273d72347a19e670685324d63a7ae12a0a99d5b0aa11da1c3a"} Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.513291 4883 scope.go:117] "RemoveContainer" containerID="68f801eebbf66b42a914bc4f687373317ee14f9ee6dba99ea4dc4fb9a1f0310b" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.540373 4883 scope.go:117] "RemoveContainer" containerID="859a5bbeb1589e12771278a11c1722a6122ae18255ec7752da9e54dc359a7b36" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.556798 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.566160 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.576748 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:23 crc kubenswrapper[4883]: E0310 09:23:23.577130 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="ceilometer-notification-agent" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577148 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="ceilometer-notification-agent" Mar 10 09:23:23 crc kubenswrapper[4883]: E0310 09:23:23.577172 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="proxy-httpd" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577179 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="proxy-httpd" Mar 10 09:23:23 crc kubenswrapper[4883]: E0310 09:23:23.577193 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="sg-core" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577200 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="sg-core" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577359 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="sg-core" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577373 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="ceilometer-notification-agent" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.577381 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" containerName="proxy-httpd" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.578843 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.581622 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.581823 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.582895 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605106 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605145 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605249 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605303 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghhwq\" (UniqueName: \"kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq\") pod \"ceilometer-0\" (UID: 
\"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605320 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605379 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.605671 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.693155 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:23 crc kubenswrapper[4883]: E0310 09:23:23.694139 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-ghhwq log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="6c96038a-8a42-4863-a89f-5076c24da12e" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707742 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707820 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707844 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707913 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707953 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghhwq\" (UniqueName: \"kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.707973 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.708022 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.708603 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.709203 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.719765 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.720530 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.721191 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.721677 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.724735 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghhwq\" (UniqueName: \"kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq\") pod \"ceilometer-0\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " pod="openstack/ceilometer-0" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.811173 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913301 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lhjm\" (UniqueName: \"kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913347 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913386 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913515 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913644 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.913796 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data\") pod \"dc0b1d9d-7834-473a-a487-6f540c606706\" (UID: \"dc0b1d9d-7834-473a-a487-6f540c606706\") " Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.914520 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.922552 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.932088 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts" (OuterVolumeSpecName: "scripts") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.932201 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm" (OuterVolumeSpecName: "kube-api-access-6lhjm") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "kube-api-access-6lhjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.935112 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:23 crc kubenswrapper[4883]: I0310 09:23:23.953838 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data" (OuterVolumeSpecName: "config-data") pod "dc0b1d9d-7834-473a-a487-6f540c606706" (UID: "dc0b1d9d-7834-473a-a487-6f540c606706"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016072 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016101 4883 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016123 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lhjm\" (UniqueName: \"kubernetes.io/projected/dc0b1d9d-7834-473a-a487-6f540c606706-kube-api-access-6lhjm\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016137 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016146 4883 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc0b1d9d-7834-473a-a487-6f540c606706-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.016155 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0b1d9d-7834-473a-a487-6f540c606706-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.089789 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87232af6-dc87-4f68-8b1f-850fd98219a8" path="/var/lib/kubelet/pods/87232af6-dc87-4f68-8b1f-850fd98219a8/volumes" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.504797 4883 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.504945 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-x2hf5" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.505887 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-x2hf5" event={"ID":"dc0b1d9d-7834-473a-a487-6f540c606706","Type":"ContainerDied","Data":"bd6a779566a1daa68a4f86119c6780e52d4493f0681838af48cbb4dbd90f52cc"} Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.505966 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd6a779566a1daa68a4f86119c6780e52d4493f0681838af48cbb4dbd90f52cc" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.539080 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.550041 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.552798 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-555c96ddb-t7tcm" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.677143 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"] Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.755612 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghhwq\" (UniqueName: \"kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.755758 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.755836 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.755867 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.755908 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.756013 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.756098 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd\") pod \"6c96038a-8a42-4863-a89f-5076c24da12e\" (UID: \"6c96038a-8a42-4863-a89f-5076c24da12e\") " Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 
09:23:24.765822 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.766509 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.767768 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.768652 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq" (OuterVolumeSpecName: "kube-api-access-ghhwq") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "kube-api-access-ghhwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.778602 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts" (OuterVolumeSpecName: "scripts") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.787946 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data" (OuterVolumeSpecName: "config-data") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.789093 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6c96038a-8a42-4863-a89f-5076c24da12e" (UID: "6c96038a-8a42-4863-a89f-5076c24da12e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.795103 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:24 crc kubenswrapper[4883]: E0310 09:23:24.795436 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0b1d9d-7834-473a-a487-6f540c606706" containerName="cinder-db-sync" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.795453 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0b1d9d-7834-473a-a487-6f540c606706" containerName="cinder-db-sync" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.795649 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc0b1d9d-7834-473a-a487-6f540c606706" containerName="cinder-db-sync" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.796464 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.805573 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.805655 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-prwrq" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.805774 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.805889 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.812546 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859041 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghhwq\" (UniqueName: 
\"kubernetes.io/projected/6c96038a-8a42-4863-a89f-5076c24da12e-kube-api-access-ghhwq\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859069 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859080 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859095 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859104 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c96038a-8a42-4863-a89f-5076c24da12e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859112 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.859120 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c96038a-8a42-4863-a89f-5076c24da12e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.878020 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"] Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.879755 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.902004 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"] Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960665 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4sgf\" (UniqueName: \"kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960726 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960814 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960837 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960858 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:24 crc kubenswrapper[4883]: I0310 09:23:24.960962 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.062498 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h2jc\" (UniqueName: \"kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.062785 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.062905 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063015 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063101 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4sgf\" (UniqueName: \"kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063177 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063298 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063377 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063454 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts\") pod \"cinder-scheduler-0\" 
(UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063546 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063651 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.063762 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.065192 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.070041 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc 
kubenswrapper[4883]: I0310 09:23:25.073934 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.077623 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.091435 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.100412 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4sgf\" (UniqueName: \"kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf\") pod \"cinder-scheduler-0\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.111690 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.113313 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.117289 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.124682 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.164185 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.165788 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h2jc\" (UniqueName: \"kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.165840 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.165887 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.165939 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb\") pod 
\"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.166015 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.166094 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.167817 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.171075 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.185553 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " 
pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.194678 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.195641 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.204725 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h2jc\" (UniqueName: \"kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc\") pod \"dnsmasq-dns-7b8fcc65cc-mnnl8\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.222663 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.268618 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.268933 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22fbf\" (UniqueName: 
\"kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.269043 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.269181 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.269300 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.269497 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.270408 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.373729 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374032 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22fbf\" (UniqueName: \"kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374052 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374096 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374301 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374383 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374484 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374573 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.374718 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.380074 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.384401 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.384830 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.388843 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.391171 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22fbf\" (UniqueName: \"kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf\") pod \"cinder-api-0\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.500666 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.512394 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.512539 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon-log" containerID="cri-o://5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff" gracePeriod=30 Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.512825 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" containerID="cri-o://ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317" gracePeriod=30 Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.522001 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.612201 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bc48b486f-2j899"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.612509 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bc48b486f-2j899" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-api" containerID="cri-o://3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49" gracePeriod=30 Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.612959 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bc48b486f-2j899" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd" containerID="cri-o://6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a" gracePeriod=30 Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.655266 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5bc48b486f-2j899" 
podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": EOF" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.674027 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.687928 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.707568 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7b5fb6fc5c-pj985"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.709385 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.720239 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b5fb6fc5c-pj985"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.730643 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.733042 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.736361 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.737573 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.751075 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.762500 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885760 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885809 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885842 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-ovndb-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885880 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885900 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.885935 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ss9n\" (UniqueName: \"kubernetes.io/projected/82fb8a17-1c35-415a-8a5d-478730286eb1-kube-api-access-9ss9n\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886315 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-internal-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886425 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-httpd-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886545 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-public-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886619 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886720 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886756 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886803 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xj82\" (UniqueName: \"kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.886862 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-combined-ca-bundle\") pod 
\"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.988979 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989029 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989084 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ss9n\" (UniqueName: \"kubernetes.io/projected/82fb8a17-1c35-415a-8a5d-478730286eb1-kube-api-access-9ss9n\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989212 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-internal-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989274 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-httpd-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc 
kubenswrapper[4883]: I0310 09:23:25.989352 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-public-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989391 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989449 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989486 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989518 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xj82\" (UniqueName: \"kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989559 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-combined-ca-bundle\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989641 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.989785 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.990466 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.990569 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-ovndb-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.990575 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 
09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.997060 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.997129 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.997719 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.998067 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-internal-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.998130 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-httpd-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.998230 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-config\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:25 crc kubenswrapper[4883]: I0310 09:23:25.998440 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-combined-ca-bundle\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.000814 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-ovndb-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.002836 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.004667 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xj82\" (UniqueName: \"kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82\") pod \"ceilometer-0\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " pod="openstack/ceilometer-0" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.006240 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ss9n\" (UniqueName: \"kubernetes.io/projected/82fb8a17-1c35-415a-8a5d-478730286eb1-kube-api-access-9ss9n\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " 
pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.008214 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/82fb8a17-1c35-415a-8a5d-478730286eb1-public-tls-certs\") pod \"neutron-7b5fb6fc5c-pj985\" (UID: \"82fb8a17-1c35-415a-8a5d-478730286eb1\") " pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.033195 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.095757 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.106375 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c96038a-8a42-4863-a89f-5076c24da12e" path="/var/lib/kubelet/pods/6c96038a-8a42-4863-a89f-5076c24da12e/volumes" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.107252 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"] Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.182395 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.522849 4883 generic.go:334] "Generic (PLEG): container finished" podID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerID="6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a" exitCode=0 Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.522907 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerDied","Data":"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"} Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.526829 4883 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-api-0" event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerStarted","Data":"987bc0cc38389c3b6190190ae5c16e3637e3022a5ee37c4c3bc24573be51664c"} Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.539178 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerStarted","Data":"3b75442ea4aee0774586b8d43c5c36b051132ecdb5e3320c19060d614e90bf9c"} Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.547680 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.553850 4883 generic.go:334] "Generic (PLEG): container finished" podID="41ac116f-f773-4b3f-a508-bc304668da18" containerID="7bfade1d342fc1fec145c040e3bb1b0fda95a95f92ce0479cba528ed21f45abc" exitCode=0 Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.553908 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerDied","Data":"7bfade1d342fc1fec145c040e3bb1b0fda95a95f92ce0479cba528ed21f45abc"} Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.556338 4883 generic.go:334] "Generic (PLEG): container finished" podID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerID="8dc17cfa98b4f413422ff6ec7b4debd0ca6ed29db8f51bb73e604fc0c8aedd72" exitCode=0 Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.556359 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" event={"ID":"fb652436-cf46-4a91-b358-f6c6a011cf43","Type":"ContainerDied","Data":"8dc17cfa98b4f413422ff6ec7b4debd0ca6ed29db8f51bb73e604fc0c8aedd72"} Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.556376 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" 
event={"ID":"fb652436-cf46-4a91-b358-f6c6a011cf43","Type":"ContainerStarted","Data":"f3dfd9c8abe53e2f4e70fd004e6457ef025d9a2c819617d0dfac05e54db79843"} Mar 10 09:23:26 crc kubenswrapper[4883]: W0310 09:23:26.565395 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa5bf577_e25f_4df2_b088_d1b667ea1d0e.slice/crio-b152cd34b224458fb1ee138c83877e4d16208f40c78470f8afbb79dee1fb3eca WatchSource:0}: Error finding container b152cd34b224458fb1ee138c83877e4d16208f40c78470f8afbb79dee1fb3eca: Status 404 returned error can't find the container with id b152cd34b224458fb1ee138c83877e4d16208f40c78470f8afbb79dee1fb3eca Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.690177 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b5fb6fc5c-pj985"] Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.759148 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919273 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919442 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919491 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs\") pod 
\"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919585 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919647 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919744 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.919853 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gb9m\" (UniqueName: \"kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.920166 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs" (OuterVolumeSpecName: "logs") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.920497 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ac116f-f773-4b3f-a508-bc304668da18-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.928252 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.929214 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m" (OuterVolumeSpecName: "kube-api-access-4gb9m") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "kube-api-access-4gb9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.943204 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5bc48b486f-2j899" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": dial tcp 10.217.0.156:9696: connect: connection refused" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.952872 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.959885 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:26 crc kubenswrapper[4883]: I0310 09:23:26.994081 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.004991 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.022986 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data" (OuterVolumeSpecName: "config-data") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023073 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") pod \"41ac116f-f773-4b3f-a508-bc304668da18\" (UID: \"41ac116f-f773-4b3f-a508-bc304668da18\") " Mar 10 09:23:27 crc kubenswrapper[4883]: W0310 09:23:27.023305 4883 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/41ac116f-f773-4b3f-a508-bc304668da18/volumes/kubernetes.io~secret/config-data Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023320 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data" (OuterVolumeSpecName: "config-data") pod "41ac116f-f773-4b3f-a508-bc304668da18" (UID: "41ac116f-f773-4b3f-a508-bc304668da18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023789 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gb9m\" (UniqueName: \"kubernetes.io/projected/41ac116f-f773-4b3f-a508-bc304668da18-kube-api-access-4gb9m\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023814 4883 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023825 4883 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023837 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023848 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.023861 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41ac116f-f773-4b3f-a508-bc304668da18-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.579737 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" event={"ID":"fb652436-cf46-4a91-b358-f6c6a011cf43","Type":"ContainerStarted","Data":"cc94e4a4d546d543f44873ce6a1ec9593535c80494c566268aaa2c5b62a424b1"} 
Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.580217 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.584277 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b5fb6fc5c-pj985" event={"ID":"82fb8a17-1c35-415a-8a5d-478730286eb1","Type":"ContainerStarted","Data":"15fb2cc42b063a1215930cdffeb827d10df153ae99d5d786342702e21c8dc077"} Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.584306 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b5fb6fc5c-pj985" event={"ID":"82fb8a17-1c35-415a-8a5d-478730286eb1","Type":"ContainerStarted","Data":"048ce7d69969a131d91672c025bd83c18be75fbcfc3649fc4a94970ff74e3a77"} Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.584319 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b5fb6fc5c-pj985" event={"ID":"82fb8a17-1c35-415a-8a5d-478730286eb1","Type":"ContainerStarted","Data":"bd937ca1a6b53d82a1756c79df23dd1772ef704b29ba04e5d1cc890f7ab8778d"} Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.585552 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.599759 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" podStartSLOduration=3.59974666 podStartE2EDuration="3.59974666s" podCreationTimestamp="2026-03-10 09:23:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:27.59827764 +0000 UTC m=+1193.853175528" watchObservedRunningTime="2026-03-10 09:23:27.59974666 +0000 UTC m=+1193.854644549" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.609589 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerStarted","Data":"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5"} Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.609643 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerStarted","Data":"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3"} Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.609644 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api-log" containerID="cri-o://0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3" gracePeriod=30 Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.609702 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.609732 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api" containerID="cri-o://4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5" gracePeriod=30 Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.621349 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerStarted","Data":"29e17f1955d24da31f1ef855d53d08193f96fe7c1aaf64e5d0776ebe40d1f3d7"} Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.622581 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7b5fb6fc5c-pj985" podStartSLOduration=2.622562604 podStartE2EDuration="2.622562604s" podCreationTimestamp="2026-03-10 09:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-10 09:23:27.615058012 +0000 UTC m=+1193.869955901" watchObservedRunningTime="2026-03-10 09:23:27.622562604 +0000 UTC m=+1193.877460493" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.625210 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerStarted","Data":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"} Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.625253 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerStarted","Data":"b152cd34b224458fb1ee138c83877e4d16208f40c78470f8afbb79dee1fb3eca"} Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.629525 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b5b6f75db-kgz55" event={"ID":"41ac116f-f773-4b3f-a508-bc304668da18","Type":"ContainerDied","Data":"867dc81634d3cab6b7d62fe7b7bea4736f522c76e9c10f3a681cf74c0480a8db"} Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.629640 4883 scope.go:117] "RemoveContainer" containerID="7bfade1d342fc1fec145c040e3bb1b0fda95a95f92ce0479cba528ed21f45abc" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.629813 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b5b6f75db-kgz55" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.644256 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.644234109 podStartE2EDuration="2.644234109s" podCreationTimestamp="2026-03-10 09:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:27.634715811 +0000 UTC m=+1193.889613700" watchObservedRunningTime="2026-03-10 09:23:27.644234109 +0000 UTC m=+1193.899131999" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.663325 4883 scope.go:117] "RemoveContainer" containerID="f44137001e9de9273d72347a19e670685324d63a7ae12a0a99d5b0aa11da1c3a" Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.668644 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"] Mar 10 09:23:27 crc kubenswrapper[4883]: I0310 09:23:27.674072 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-b5b6f75db-kgz55"] Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.095053 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ac116f-f773-4b3f-a508-bc304668da18" path="/var/lib/kubelet/pods/41ac116f-f773-4b3f-a508-bc304668da18/volumes" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.151980 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5bc48b486f-2j899" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.152780 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.152844 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.152914 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.152931 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.224554 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.227779 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.244703 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.247552 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.255293 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.255458 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj9nk\" (UniqueName: \"kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.255711 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config\") pod \"77895d16-8ad3-4edb-ae91-d807afd499b3\" (UID: \"77895d16-8ad3-4edb-ae91-d807afd499b3\") " Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.256934 4883 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.257045 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.257104 4883 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.257154 4883 
reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.258675 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk" (OuterVolumeSpecName: "kube-api-access-xj9nk") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "kube-api-access-xj9nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.258821 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.290006 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config" (OuterVolumeSpecName: "config") pod "77895d16-8ad3-4edb-ae91-d807afd499b3" (UID: "77895d16-8ad3-4edb-ae91-d807afd499b3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.358889 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj9nk\" (UniqueName: \"kubernetes.io/projected/77895d16-8ad3-4edb-ae91-d807afd499b3-kube-api-access-xj9nk\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.358914 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.358924 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/77895d16-8ad3-4edb-ae91-d807afd499b3-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.642631 4883 generic.go:334] "Generic (PLEG): container finished" podID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerID="3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49" exitCode=0
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.642696 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bc48b486f-2j899"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.642733 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerDied","Data":"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"}
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.643611 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bc48b486f-2j899" event={"ID":"77895d16-8ad3-4edb-ae91-d807afd499b3","Type":"ContainerDied","Data":"3e2a2bca9130ffcd91c88897030c731f9414ca6f4f8e578189201420e38446dd"}
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.643638 4883 scope.go:117] "RemoveContainer" containerID="6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.648230 4883 generic.go:334] "Generic (PLEG): container finished" podID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerID="0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3" exitCode=143
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.648315 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerDied","Data":"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3"}
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.650810 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerStarted","Data":"95a8cd9db52de24a092ef958c5d1f9e08fab7d8af23ccfe700527e534ffb3136"}
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.666197 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerStarted","Data":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"}
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.672604 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.586911066 podStartE2EDuration="4.672578665s" podCreationTimestamp="2026-03-10 09:23:24 +0000 UTC" firstStartedPulling="2026-03-10 09:23:25.734236781 +0000 UTC m=+1191.989134671" lastFinishedPulling="2026-03-10 09:23:26.819904381 +0000 UTC m=+1193.074802270" observedRunningTime="2026-03-10 09:23:28.671185999 +0000 UTC m=+1194.926083888" watchObservedRunningTime="2026-03-10 09:23:28.672578665 +0000 UTC m=+1194.927476554"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.678223 4883 scope.go:117] "RemoveContainer" containerID="3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.698556 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bc48b486f-2j899"]
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.720495 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5bc48b486f-2j899"]
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.723531 4883 scope.go:117] "RemoveContainer" containerID="6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"
Mar 10 09:23:28 crc kubenswrapper[4883]: E0310 09:23:28.724293 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a\": container with ID starting with 6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a not found: ID does not exist" containerID="6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.724326 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a"} err="failed to get container status \"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a\": rpc error: code = NotFound desc = could not find container \"6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a\": container with ID starting with 6e4bff320a693f1795e114771e40e2c1e356707f01eaf5055830385ef8ea600a not found: ID does not exist"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.724349 4883 scope.go:117] "RemoveContainer" containerID="3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"
Mar 10 09:23:28 crc kubenswrapper[4883]: E0310 09:23:28.725846 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49\": container with ID starting with 3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49 not found: ID does not exist" containerID="3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"
Mar 10 09:23:28 crc kubenswrapper[4883]: I0310 09:23:28.725919 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49"} err="failed to get container status \"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49\": rpc error: code = NotFound desc = could not find container \"3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49\": container with ID starting with 3516d32f0bc581f4f1393801532efca1680f60d3cba44203e1532496ead6ef49 not found: ID does not exist"
Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.708212 4883 generic.go:334] "Generic (PLEG): container finished" podID="a4909549-f2c4-45b0-a8f8-521302991297" containerID="ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317" exitCode=0
Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.708299 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerDied","Data":"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317"}
Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.785123 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerStarted","Data":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"}
Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.787326 4883 generic.go:334] "Generic (PLEG): container finished" podID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerID="0e86e22acb39b181cf412775b1b1f806082c86689f0257c2525c32d95cd32424" exitCode=137
Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.787355 4883 generic.go:334] "Generic (PLEG): container finished" podID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerID="c9fb87caf677ab775fd09fb5b4c1268f29ff2c553286a49e90875b8415d4d3a9" exitCode=137
Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.787381 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerDied","Data":"0e86e22acb39b181cf412775b1b1f806082c86689f0257c2525c32d95cd32424"}
Mar 10 09:23:29 crc kubenswrapper[4883]: I0310 09:23:29.787464 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerDied","Data":"c9fb87caf677ab775fd09fb5b4c1268f29ff2c553286a49e90875b8415d4d3a9"}
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.057435 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-798c4d5785-ftwkg"
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.092585 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" path="/var/lib/kubelet/pods/77895d16-8ad3-4edb-ae91-d807afd499b3/volumes"
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.102827 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts\") pod \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") "
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.102928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs\") pod \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") "
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.102982 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data\") pod \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") "
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.103000 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxrwr\" (UniqueName: \"kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr\") pod \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") "
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.103026 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key\") pod \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\" (UID: \"dedd724b-83f5-408a-b4d5-08adb5d71cc0\") "
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.104077 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs" (OuterVolumeSpecName: "logs") pod "dedd724b-83f5-408a-b4d5-08adb5d71cc0" (UID: "dedd724b-83f5-408a-b4d5-08adb5d71cc0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.108887 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dedd724b-83f5-408a-b4d5-08adb5d71cc0" (UID: "dedd724b-83f5-408a-b4d5-08adb5d71cc0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.114418 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr" (OuterVolumeSpecName: "kube-api-access-fxrwr") pod "dedd724b-83f5-408a-b4d5-08adb5d71cc0" (UID: "dedd724b-83f5-408a-b4d5-08adb5d71cc0"). InnerVolumeSpecName "kube-api-access-fxrwr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.125339 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data" (OuterVolumeSpecName: "config-data") pod "dedd724b-83f5-408a-b4d5-08adb5d71cc0" (UID: "dedd724b-83f5-408a-b4d5-08adb5d71cc0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.126330 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts" (OuterVolumeSpecName: "scripts") pod "dedd724b-83f5-408a-b4d5-08adb5d71cc0" (UID: "dedd724b-83f5-408a-b4d5-08adb5d71cc0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.165143 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.204543 4883 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dedd724b-83f5-408a-b4d5-08adb5d71cc0-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.204643 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.204701 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dedd724b-83f5-408a-b4d5-08adb5d71cc0-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.204749 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dedd724b-83f5-408a-b4d5-08adb5d71cc0-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.204814 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxrwr\" (UniqueName: \"kubernetes.io/projected/dedd724b-83f5-408a-b4d5-08adb5d71cc0-kube-api-access-fxrwr\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.799246 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-798c4d5785-ftwkg"
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.800142 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-798c4d5785-ftwkg" event={"ID":"dedd724b-83f5-408a-b4d5-08adb5d71cc0","Type":"ContainerDied","Data":"b811a98b9eb641299a8060183de1afd8f605b6eb8c5e07f91e568070217a7cad"}
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.800295 4883 scope.go:117] "RemoveContainer" containerID="0e86e22acb39b181cf412775b1b1f806082c86689f0257c2525c32d95cd32424"
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.932189 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"]
Mar 10 09:23:30 crc kubenswrapper[4883]: I0310 09:23:30.938523 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-798c4d5785-ftwkg"]
Mar 10 09:23:31 crc kubenswrapper[4883]: I0310 09:23:31.032777 4883 scope.go:117] "RemoveContainer" containerID="c9fb87caf677ab775fd09fb5b4c1268f29ff2c553286a49e90875b8415d4d3a9"
Mar 10 09:23:31 crc kubenswrapper[4883]: I0310 09:23:31.262643 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused"
Mar 10 09:23:31 crc kubenswrapper[4883]: I0310 09:23:31.810569 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerStarted","Data":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"}
Mar 10 09:23:31 crc kubenswrapper[4883]: I0310 09:23:31.836052 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.654828327 podStartE2EDuration="6.836031279s" podCreationTimestamp="2026-03-10 09:23:25 +0000 UTC" firstStartedPulling="2026-03-10 09:23:26.586767769 +0000 UTC m=+1192.841665657" lastFinishedPulling="2026-03-10 09:23:30.76797072 +0000 UTC m=+1197.022868609" observedRunningTime="2026-03-10 09:23:31.832289743 +0000 UTC m=+1198.087187632" watchObservedRunningTime="2026-03-10 09:23:31.836031279 +0000 UTC m=+1198.090929167"
Mar 10 09:23:32 crc kubenswrapper[4883]: I0310 09:23:32.091200 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" path="/var/lib/kubelet/pods/dedd724b-83f5-408a-b4d5-08adb5d71cc0/volumes"
Mar 10 09:23:32 crc kubenswrapper[4883]: I0310 09:23:32.820765 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.407203 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.463609 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.503216 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"
Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.580156 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"]
Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.580569 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerName="dnsmasq-dns" containerID="cri-o://25d1184c834395ecc1ce637374da9092da7e27ecf8a8400d81776463506396e2" gracePeriod=10
Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.862378 4883 generic.go:334] "Generic (PLEG): container finished" podID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerID="25d1184c834395ecc1ce637374da9092da7e27ecf8a8400d81776463506396e2" exitCode=0
Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.862668 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" event={"ID":"ce11c091-db9b-47fb-8427-dcaa2585a4c7","Type":"ContainerDied","Data":"25d1184c834395ecc1ce637374da9092da7e27ecf8a8400d81776463506396e2"}
Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.862757 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="cinder-scheduler" containerID="cri-o://29e17f1955d24da31f1ef855d53d08193f96fe7c1aaf64e5d0776ebe40d1f3d7" gracePeriod=30
Mar 10 09:23:35 crc kubenswrapper[4883]: I0310 09:23:35.862983 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="probe" containerID="cri-o://95a8cd9db52de24a092ef958c5d1f9e08fab7d8af23ccfe700527e534ffb3136" gracePeriod=30
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.089073 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b"
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217068 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmx5g\" (UniqueName: \"kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") "
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217191 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") "
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217247 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") "
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217270 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") "
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217290 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") "
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.217366 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc\") pod \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\" (UID: \"ce11c091-db9b-47fb-8427-dcaa2585a4c7\") "
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.223004 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g" (OuterVolumeSpecName: "kube-api-access-pmx5g") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "kube-api-access-pmx5g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.252746 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.255288 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.256033 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.261869 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.265010 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config" (OuterVolumeSpecName: "config") pod "ce11c091-db9b-47fb-8427-dcaa2585a4c7" (UID: "ce11c091-db9b-47fb-8427-dcaa2585a4c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321259 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmx5g\" (UniqueName: \"kubernetes.io/projected/ce11c091-db9b-47fb-8427-dcaa2585a4c7-kube-api-access-pmx5g\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321288 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321299 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-config\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321310 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321321 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.321330 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce11c091-db9b-47fb-8427-dcaa2585a4c7-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.873150 4883 generic.go:334] "Generic (PLEG): container finished" podID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerID="95a8cd9db52de24a092ef958c5d1f9e08fab7d8af23ccfe700527e534ffb3136" exitCode=0
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.873274 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerDied","Data":"95a8cd9db52de24a092ef958c5d1f9e08fab7d8af23ccfe700527e534ffb3136"}
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.875661 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b" event={"ID":"ce11c091-db9b-47fb-8427-dcaa2585a4c7","Type":"ContainerDied","Data":"f2db0f853d6e758fe5cee704d792086fd294dc16742d1ad01bde933a91c3bcaa"}
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.875721 4883 scope.go:117] "RemoveContainer" containerID="25d1184c834395ecc1ce637374da9092da7e27ecf8a8400d81776463506396e2"
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.875904 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-tgr6b"
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.900009 4883 scope.go:117] "RemoveContainer" containerID="611c5c4793d5a469fa00a11e611df52aec3fea84115f5f16327469e87284b34b"
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.919170 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"]
Mar 10 09:23:36 crc kubenswrapper[4883]: I0310 09:23:36.926632 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-tgr6b"]
Mar 10 09:23:37 crc kubenswrapper[4883]: I0310 09:23:37.183930 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 10 09:23:37 crc kubenswrapper[4883]: I0310 09:23:37.886719 4883 generic.go:334] "Generic (PLEG): container finished" podID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerID="29e17f1955d24da31f1ef855d53d08193f96fe7c1aaf64e5d0776ebe40d1f3d7" exitCode=0
Mar 10 09:23:37 crc kubenswrapper[4883]: I0310 09:23:37.886998 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerDied","Data":"29e17f1955d24da31f1ef855d53d08193f96fe7c1aaf64e5d0776ebe40d1f3d7"}
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.087843 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.092079 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" path="/var/lib/kubelet/pods/ce11c091-db9b-47fb-8427-dcaa2585a4c7/volumes"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.155833 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data\") pod \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") "
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.155871 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts\") pod \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") "
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.155907 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id\") pod \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") "
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.155934 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom\") pod \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") "
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.156010 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle\") pod \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") "
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.156030 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.156062 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4sgf\" (UniqueName: \"kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf\") pod \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\" (UID: \"0a4664a7-ad8d-44ab-8f7f-d621e6b01899\") "
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.156435 4883 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.161375 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts" (OuterVolumeSpecName: "scripts") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.161739 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf" (OuterVolumeSpecName: "kube-api-access-d4sgf") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "kube-api-access-d4sgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.161895 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.192536 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.231874 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data" (OuterVolumeSpecName: "config-data") pod "0a4664a7-ad8d-44ab-8f7f-d621e6b01899" (UID: "0a4664a7-ad8d-44ab-8f7f-d621e6b01899"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.259394 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4sgf\" (UniqueName: \"kubernetes.io/projected/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-kube-api-access-d4sgf\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.259433 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.259445 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.259454 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.259465 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4664a7-ad8d-44ab-8f7f-d621e6b01899-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.896723 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a4664a7-ad8d-44ab-8f7f-d621e6b01899","Type":"ContainerDied","Data":"3b75442ea4aee0774586b8d43c5c36b051132ecdb5e3320c19060d614e90bf9c"}
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.896786 4883 scope.go:117] "RemoveContainer" containerID="95a8cd9db52de24a092ef958c5d1f9e08fab7d8af23ccfe700527e534ffb3136"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.896784 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.913728 4883 scope.go:117] "RemoveContainer" containerID="29e17f1955d24da31f1ef855d53d08193f96fe7c1aaf64e5d0776ebe40d1f3d7"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.922588 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.928006 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940506 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940877 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940896 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon"
Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940911 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerName="dnsmasq-dns"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940917 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerName="dnsmasq-dns"
Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940930 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon-log"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940936 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon-log"
Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940955 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="cinder-scheduler"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940961 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="cinder-scheduler"
Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940975 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-api"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940980 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-api"
Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.940988 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.940993 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd"
Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.941001 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941008 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api"
Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.941017 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="probe"
Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941023 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="probe"
Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.941034 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7"
containerName="init" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941039 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerName="init" Mar 10 09:23:38 crc kubenswrapper[4883]: E0310 09:23:38.941046 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api-log" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941051 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api-log" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941202 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941216 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce11c091-db9b-47fb-8427-dcaa2585a4c7" containerName="dnsmasq-dns" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941223 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="cinder-scheduler" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941233 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon-log" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941242 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-httpd" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941250 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" containerName="probe" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941256 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedd724b-83f5-408a-b4d5-08adb5d71cc0" containerName="horizon" Mar 10 
09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941271 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="77895d16-8ad3-4edb-ae91-d807afd499b3" containerName="neutron-api" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.941281 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ac116f-f773-4b3f-a508-bc304668da18" containerName="barbican-api-log" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.942182 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.943712 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 09:23:38 crc kubenswrapper[4883]: I0310 09:23:38.953372 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.079861 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.079899 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.079980 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.080044 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btv4b\" (UniqueName: \"kubernetes.io/projected/a7bae0a1-9bb8-47ba-a161-764cd7406992-kube-api-access-btv4b\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.080160 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.080192 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7bae0a1-9bb8-47ba-a161-764cd7406992-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.181863 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.181932 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btv4b\" (UniqueName: \"kubernetes.io/projected/a7bae0a1-9bb8-47ba-a161-764cd7406992-kube-api-access-btv4b\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " 
pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.182046 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.182078 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7bae0a1-9bb8-47ba-a161-764cd7406992-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.182358 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7bae0a1-9bb8-47ba-a161-764cd7406992-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.182866 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.182967 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.186902 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.187210 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.188058 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.188179 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bae0a1-9bb8-47ba-a161-764cd7406992-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.197911 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btv4b\" (UniqueName: \"kubernetes.io/projected/a7bae0a1-9bb8-47ba-a161-764cd7406992-kube-api-access-btv4b\") pod \"cinder-scheduler-0\" (UID: \"a7bae0a1-9bb8-47ba-a161-764cd7406992\") " pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.258277 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.657849 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 09:23:39 crc kubenswrapper[4883]: W0310 09:23:39.661611 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7bae0a1_9bb8_47ba_a161_764cd7406992.slice/crio-658adbd61df5c334c9198f35c7aa39ec91f5a9e8aca298d8495188a80fe42f9a WatchSource:0}: Error finding container 658adbd61df5c334c9198f35c7aa39ec91f5a9e8aca298d8495188a80fe42f9a: Status 404 returned error can't find the container with id 658adbd61df5c334c9198f35c7aa39ec91f5a9e8aca298d8495188a80fe42f9a Mar 10 09:23:39 crc kubenswrapper[4883]: I0310 09:23:39.909758 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a7bae0a1-9bb8-47ba-a161-764cd7406992","Type":"ContainerStarted","Data":"658adbd61df5c334c9198f35c7aa39ec91f5a9e8aca298d8495188a80fe42f9a"} Mar 10 09:23:40 crc kubenswrapper[4883]: I0310 09:23:40.097321 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a4664a7-ad8d-44ab-8f7f-d621e6b01899" path="/var/lib/kubelet/pods/0a4664a7-ad8d-44ab-8f7f-d621e6b01899/volumes" Mar 10 09:23:40 crc kubenswrapper[4883]: I0310 09:23:40.864392 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-744f4576f6-kglt9" Mar 10 09:23:40 crc kubenswrapper[4883]: I0310 09:23:40.923237 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a7bae0a1-9bb8-47ba-a161-764cd7406992","Type":"ContainerStarted","Data":"05192ac9b45aeed45a69a1553b5262897db22a4255d969c7aec06c33ce478cce"} Mar 10 09:23:40 crc kubenswrapper[4883]: I0310 09:23:40.923284 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"a7bae0a1-9bb8-47ba-a161-764cd7406992","Type":"ContainerStarted","Data":"66ce932d5a4da7e11095bb03edb1c911dda0fe72e21487c190608ef140a2dae6"} Mar 10 09:23:40 crc kubenswrapper[4883]: I0310 09:23:40.945031 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.945009183 podStartE2EDuration="2.945009183s" podCreationTimestamp="2026-03-10 09:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:40.941582752 +0000 UTC m=+1207.196480642" watchObservedRunningTime="2026-03-10 09:23:40.945009183 +0000 UTC m=+1207.199907073" Mar 10 09:23:41 crc kubenswrapper[4883]: I0310 09:23:41.261786 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.259194 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.499293 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.501807 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.505879 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config-secret\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.506366 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.506632 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.506770 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmqrd\" (UniqueName: \"kubernetes.io/projected/166b0c95-d44f-41e4-b27a-01e549dfb9d2-kube-api-access-qmqrd\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.509722 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.510797 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 
09:23:44.510894 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.510916 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-642mt" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.608350 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.608462 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.608515 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmqrd\" (UniqueName: \"kubernetes.io/projected/166b0c95-d44f-41e4-b27a-01e549dfb9d2-kube-api-access-qmqrd\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.608583 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config-secret\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.612100 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.615748 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-openstack-config-secret\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.615938 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/166b0c95-d44f-41e4-b27a-01e549dfb9d2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.624864 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmqrd\" (UniqueName: \"kubernetes.io/projected/166b0c95-d44f-41e4-b27a-01e549dfb9d2-kube-api-access-qmqrd\") pod \"openstackclient\" (UID: \"166b0c95-d44f-41e4-b27a-01e549dfb9d2\") " pod="openstack/openstackclient" Mar 10 09:23:44 crc kubenswrapper[4883]: I0310 09:23:44.822081 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 09:23:45 crc kubenswrapper[4883]: I0310 09:23:45.243068 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 09:23:45 crc kubenswrapper[4883]: I0310 09:23:45.966177 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"166b0c95-d44f-41e4-b27a-01e549dfb9d2","Type":"ContainerStarted","Data":"e27aa793a515e22a3bd9f4f5ac0b1c0c27191df3f76a0a58fdc118ca2a860551"} Mar 10 09:23:47 crc kubenswrapper[4883]: I0310 09:23:47.449076 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:23:47 crc kubenswrapper[4883]: I0310 09:23:47.449401 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:23:47 crc kubenswrapper[4883]: I0310 09:23:47.449456 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:23:47 crc kubenswrapper[4883]: I0310 09:23:47.450270 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:23:47 crc kubenswrapper[4883]: I0310 09:23:47.450324 4883 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3" gracePeriod=600 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.001996 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3" exitCode=0 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.002163 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3"} Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.002430 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225"} Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.002496 4883 scope.go:117] "RemoveContainer" containerID="7b763722fafe0669f89812126b4b9a1c6fdb578c65fcf3519f299d35b8f8264e" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.102135 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6656f7cc-nv5pp"] Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.104278 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6656f7cc-nv5pp"] Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.104380 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.106924 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.107406 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.112715 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206026 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-config-data\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206308 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-etc-swift\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206486 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-run-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206684 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-internal-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206929 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-public-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.206963 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-log-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.207035 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-combined-ca-bundle\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.207136 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j55nn\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-kube-api-access-j55nn\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.230182 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:48 
crc kubenswrapper[4883]: I0310 09:23:48.230534 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-central-agent" containerID="cri-o://c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" gracePeriod=30 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.231447 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="proxy-httpd" containerID="cri-o://c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" gracePeriod=30 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.231552 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="sg-core" containerID="cri-o://77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" gracePeriod=30 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.231629 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-notification-agent" containerID="cri-o://ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" gracePeriod=30 Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.249977 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.173:3000/\": EOF" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.308960 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-internal-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: 
\"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309108 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-public-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309133 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-log-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309177 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-combined-ca-bundle\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309224 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j55nn\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-kube-api-access-j55nn\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309254 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-config-data\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " 
pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309270 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-etc-swift\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.309378 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-run-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.310196 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-run-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.314930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-log-httpd\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.321205 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-internal-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.322884 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-public-tls-certs\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.324560 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-combined-ca-bundle\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.327966 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-etc-swift\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.328720 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j55nn\" (UniqueName: \"kubernetes.io/projected/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-kube-api-access-j55nn\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.339686 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b-config-data\") pod \"swift-proxy-6656f7cc-nv5pp\" (UID: \"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b\") " pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.432271 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.926578 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:48 crc kubenswrapper[4883]: I0310 09:23:48.941443 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6656f7cc-nv5pp"] Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.014294 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6656f7cc-nv5pp" event={"ID":"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b","Type":"ContainerStarted","Data":"613e9b6262e6592360fde16de67b1c0eafb2ca168888dae279806bfa2fb4a2a8"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017808 4883 generic.go:334] "Generic (PLEG): container finished" podID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" exitCode=0 Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017833 4883 generic.go:334] "Generic (PLEG): container finished" podID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" exitCode=2 Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017842 4883 generic.go:334] "Generic (PLEG): container finished" podID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" exitCode=0 Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017849 4883 generic.go:334] "Generic (PLEG): container finished" podID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" exitCode=0 Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017886 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerDied","Data":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017907 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerDied","Data":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017917 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerDied","Data":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017926 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerDied","Data":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017941 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa5bf577-e25f-4df2-b088-d1b667ea1d0e","Type":"ContainerDied","Data":"b152cd34b224458fb1ee138c83877e4d16208f40c78470f8afbb79dee1fb3eca"} Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.017956 4883 scope.go:117] "RemoveContainer" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.018695 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034555 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034629 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034658 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034749 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xj82\" (UniqueName: \"kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034785 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034810 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.034830 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd\") pod \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\" (UID: \"aa5bf577-e25f-4df2-b088-d1b667ea1d0e\") " Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.038690 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.038914 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.040572 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts" (OuterVolumeSpecName: "scripts") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.042086 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82" (OuterVolumeSpecName: "kube-api-access-4xj82") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "kube-api-access-4xj82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.045145 4883 scope.go:117] "RemoveContainer" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.075578 4883 scope.go:117] "RemoveContainer" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.075587 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.117377 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137427 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xj82\" (UniqueName: \"kubernetes.io/projected/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-kube-api-access-4xj82\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137453 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137464 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137503 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137513 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.137522 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.155299 4883 scope.go:117] "RemoveContainer" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.218314 4883 scope.go:117] "RemoveContainer" 
containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.220054 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data" (OuterVolumeSpecName: "config-data") pod "aa5bf577-e25f-4df2-b088-d1b667ea1d0e" (UID: "aa5bf577-e25f-4df2-b088-d1b667ea1d0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.220227 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": container with ID starting with c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73 not found: ID does not exist" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.220251 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"} err="failed to get container status \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": rpc error: code = NotFound desc = could not find container \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": container with ID starting with c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.220278 4883 scope.go:117] "RemoveContainer" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.220575 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": container with ID starting with 77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6 not found: ID does not exist" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.220718 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"} err="failed to get container status \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": rpc error: code = NotFound desc = could not find container \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": container with ID starting with 77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.220737 4883 scope.go:117] "RemoveContainer" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.221006 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": container with ID starting with ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c not found: ID does not exist" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221023 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"} err="failed to get container status \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": rpc error: code = NotFound desc = could not find container \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": container with ID 
starting with ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221048 4883 scope.go:117] "RemoveContainer" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.221436 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": container with ID starting with c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de not found: ID does not exist" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221452 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"} err="failed to get container status \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": rpc error: code = NotFound desc = could not find container \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": container with ID starting with c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221526 4883 scope.go:117] "RemoveContainer" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221773 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"} err="failed to get container status \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": rpc error: code = NotFound desc = could not find container \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": 
container with ID starting with c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.221789 4883 scope.go:117] "RemoveContainer" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222043 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"} err="failed to get container status \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": rpc error: code = NotFound desc = could not find container \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": container with ID starting with 77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222065 4883 scope.go:117] "RemoveContainer" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222314 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"} err="failed to get container status \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": rpc error: code = NotFound desc = could not find container \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": container with ID starting with ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222337 4883 scope.go:117] "RemoveContainer" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222600 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"} err="failed to get container status \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": rpc error: code = NotFound desc = could not find container \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": container with ID starting with c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.222622 4883 scope.go:117] "RemoveContainer" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.223016 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"} err="failed to get container status \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": rpc error: code = NotFound desc = could not find container \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": container with ID starting with c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.223031 4883 scope.go:117] "RemoveContainer" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.223431 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"} err="failed to get container status \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": rpc error: code = NotFound desc = could not find container \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": container with ID starting with 77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6 not found: ID does not 
exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.223445 4883 scope.go:117] "RemoveContainer" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.223991 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"} err="failed to get container status \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": rpc error: code = NotFound desc = could not find container \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": container with ID starting with ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.224006 4883 scope.go:117] "RemoveContainer" containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.224438 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"} err="failed to get container status \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": rpc error: code = NotFound desc = could not find container \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": container with ID starting with c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.224451 4883 scope.go:117] "RemoveContainer" containerID="c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.224858 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73"} err="failed to get container status 
\"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": rpc error: code = NotFound desc = could not find container \"c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73\": container with ID starting with c5b556bd67d098985d595293ab2817163dff941d98c07d1bd74501d36fa08f73 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.224871 4883 scope.go:117] "RemoveContainer" containerID="77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.225213 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6"} err="failed to get container status \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": rpc error: code = NotFound desc = could not find container \"77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6\": container with ID starting with 77339c50ceb4e8189dd4e3dab67e264fce09d0920656bb447aa4b16477c79ae6 not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.225226 4883 scope.go:117] "RemoveContainer" containerID="ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.225714 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c"} err="failed to get container status \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": rpc error: code = NotFound desc = could not find container \"ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c\": container with ID starting with ea796fe946dd6de7a3bfe27ded8c282e888f7a330cd0a6a952fe30af387f556c not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.225732 4883 scope.go:117] "RemoveContainer" 
containerID="c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.226402 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de"} err="failed to get container status \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": rpc error: code = NotFound desc = could not find container \"c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de\": container with ID starting with c568e671e1483c0fe8ca2903003e1c513ad64a8df95a2524f673e4547bcd14de not found: ID does not exist" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.238665 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5bf577-e25f-4df2-b088-d1b667ea1d0e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.355674 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.363568 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.384281 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.384719 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="sg-core" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.384741 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="sg-core" Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.384755 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="proxy-httpd" Mar 10 09:23:49 crc kubenswrapper[4883]: 
I0310 09:23:49.384761 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="proxy-httpd" Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.384779 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-notification-agent" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.384785 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-notification-agent" Mar 10 09:23:49 crc kubenswrapper[4883]: E0310 09:23:49.384804 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-central-agent" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.384809 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-central-agent" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.384987 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="sg-core" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.385012 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-notification-agent" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.385021 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="proxy-httpd" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.385036 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" containerName="ceilometer-central-agent" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.387040 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.389373 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.395379 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.404091 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441507 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441591 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4v8\" (UniqueName: \"kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441762 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441844 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441869 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441907 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.441929 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.541448 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543447 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543560 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs4v8\" (UniqueName: \"kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8\") pod \"ceilometer-0\" (UID: 
\"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543628 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543703 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543726 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543752 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.543774 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.544139 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.544199 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.548259 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.547524 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.552096 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.552158 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.564075 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xs4v8\" (UniqueName: \"kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8\") pod \"ceilometer-0\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " pod="openstack/ceilometer-0" Mar 10 09:23:49 crc kubenswrapper[4883]: I0310 09:23:49.712291 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:23:50 crc kubenswrapper[4883]: I0310 09:23:50.045231 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6656f7cc-nv5pp" event={"ID":"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b","Type":"ContainerStarted","Data":"0ea21f69372c054943b2a3fa43db7fbcdd35a3e836454d8db14baa2b8075e60b"} Mar 10 09:23:50 crc kubenswrapper[4883]: I0310 09:23:50.045298 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6656f7cc-nv5pp" event={"ID":"ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b","Type":"ContainerStarted","Data":"de0c2d43864189c52c0b910f86838a18fa45d493ba35f8abee881a9ffb22760b"} Mar 10 09:23:50 crc kubenswrapper[4883]: I0310 09:23:50.046818 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:50 crc kubenswrapper[4883]: I0310 09:23:50.046855 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:50 crc kubenswrapper[4883]: I0310 09:23:50.070409 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6656f7cc-nv5pp" podStartSLOduration=2.070388996 podStartE2EDuration="2.070388996s" podCreationTimestamp="2026-03-10 09:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:50.068272867 +0000 UTC m=+1216.323170746" watchObservedRunningTime="2026-03-10 09:23:50.070388996 +0000 UTC m=+1216.325286886" Mar 10 09:23:50 crc 
kubenswrapper[4883]: I0310 09:23:50.091180 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5bf577-e25f-4df2-b088-d1b667ea1d0e" path="/var/lib/kubelet/pods/aa5bf577-e25f-4df2-b088-d1b667ea1d0e/volumes" Mar 10 09:23:51 crc kubenswrapper[4883]: I0310 09:23:51.040461 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:51 crc kubenswrapper[4883]: I0310 09:23:51.042219 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5946656968-5mzlm" Mar 10 09:23:51 crc kubenswrapper[4883]: I0310 09:23:51.261662 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5fcc9bbb48-lf4jb" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 10 09:23:51 crc kubenswrapper[4883]: I0310 09:23:51.262035 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:23:54 crc kubenswrapper[4883]: I0310 09:23:54.961248 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.027193 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.027673 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-log" containerID="cri-o://9942d2bf12f93ebc9815c517a342ac88745bd4ad2c848364cca8746de3f97630" gracePeriod=30 Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.027828 4883 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-httpd" containerID="cri-o://a4d2619e8455e83dda94f13bca3f4af0a00dbc8f602e8d4ef336ec6c1e921f82" gracePeriod=30 Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.110391 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"166b0c95-d44f-41e4-b27a-01e549dfb9d2","Type":"ContainerStarted","Data":"a5721b4d9c7d6d5ef64c930764160a5476875b9aa466203bac010d9ce58c29b4"} Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.111673 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerStarted","Data":"8662768bfd83f0fff77bedd9babc38ac435600a470cdeb93306da3bcece7d468"} Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.132850 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.812426871 podStartE2EDuration="11.132838023s" podCreationTimestamp="2026-03-10 09:23:44 +0000 UTC" firstStartedPulling="2026-03-10 09:23:45.254642995 +0000 UTC m=+1211.509540884" lastFinishedPulling="2026-03-10 09:23:54.575054156 +0000 UTC m=+1220.829952036" observedRunningTime="2026-03-10 09:23:55.123691525 +0000 UTC m=+1221.378589404" watchObservedRunningTime="2026-03-10 09:23:55.132838023 +0000 UTC m=+1221.387735913" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.497747 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4vxd6"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.498835 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.510729 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4vxd6"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.601254 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-l9ldx"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.602721 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-l9ldx" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.605937 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bll5g\" (UniqueName: \"kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.606000 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.613208 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f74b-account-create-update-lsxls"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.616180 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.618102 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.624401 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-l9ldx"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.637325 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f74b-account-create-update-lsxls"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708306 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll5g\" (UniqueName: \"kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708353 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708381 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708418 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmdwp\" (UniqueName: 
\"kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708443 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqqsw\" (UniqueName: \"kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.708529 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.709391 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.709445 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-zr486"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.710659 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.717630 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zr486"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.729087 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bll5g\" (UniqueName: \"kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g\") pod \"nova-api-db-create-4vxd6\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811264 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811355 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmdwp\" (UniqueName: \"kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811391 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqqsw\" (UniqueName: \"kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811412 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811452 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.811613 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwb6\" (UniqueName: \"kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.813103 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.814285 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.815052 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.830388 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-15d6-account-create-update-lwjcj"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.831719 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.834051 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.844643 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-15d6-account-create-update-lwjcj"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.845027 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmdwp\" (UniqueName: \"kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp\") pod \"nova-api-f74b-account-create-update-lsxls\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.853124 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqqsw\" (UniqueName: \"kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw\") pod \"nova-cell0-db-create-l9ldx\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " pod="openstack/nova-cell0-db-create-l9ldx" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.880258 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.880859 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-log" containerID="cri-o://feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731" gracePeriod=30 Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.881387 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f9967357-b98f-4e31-9934-f99669b31024" 
containerName="glance-httpd" containerID="cri-o://93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032" gracePeriod=30 Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.913413 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwb6\" (UniqueName: \"kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.913496 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.913576 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd5d5\" (UniqueName: \"kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.913701 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.914593 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.945963 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwb6\" (UniqueName: \"kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6\") pod \"nova-cell1-db-create-zr486\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.959701 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.964021 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-l9ldx" Mar 10 09:23:55 crc kubenswrapper[4883]: I0310 09:23:55.992776 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015298 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015451 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015496 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015551 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015666 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scthm\" (UniqueName: \"kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015739 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.015797 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key\") pod \"a4909549-f2c4-45b0-a8f8-521302991297\" (UID: \"a4909549-f2c4-45b0-a8f8-521302991297\") " Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.016283 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.016423 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd5d5\" (UniqueName: \"kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.022961 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.025316 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs" (OuterVolumeSpecName: "logs") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.027560 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm" (OuterVolumeSpecName: "kube-api-access-scthm") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "kube-api-access-scthm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.038229 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.074901 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.111057 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd5d5\" (UniqueName: \"kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5\") pod \"nova-cell0-15d6-account-create-update-lwjcj\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.168122 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scthm\" (UniqueName: \"kubernetes.io/projected/a4909549-f2c4-45b0-a8f8-521302991297-kube-api-access-scthm\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.168155 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4909549-f2c4-45b0-a8f8-521302991297-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.168167 4883 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.201382 4883 generic.go:334] "Generic (PLEG): container finished" podID="f9967357-b98f-4e31-9934-f99669b31024" containerID="feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731" exitCode=143 Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.221151 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.221153 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerDied","Data":"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731"} Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.221279 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7b5fb6fc5c-pj985" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.230691 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c052-account-create-update-hg4pd"] Mar 10 09:23:56 crc kubenswrapper[4883]: E0310 09:23:56.234871 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon-log" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.234891 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon-log" Mar 10 09:23:56 crc kubenswrapper[4883]: E0310 09:23:56.234920 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.234927 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.235250 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.235280 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4909549-f2c4-45b0-a8f8-521302991297" containerName="horizon-log" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.236341 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.238323 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.240837 4883 generic.go:334] "Generic (PLEG): container finished" podID="a4909549-f2c4-45b0-a8f8-521302991297" containerID="5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff" exitCode=137 Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.240990 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fcc9bbb48-lf4jb" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.241038 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerDied","Data":"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff"} Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.241066 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcc9bbb48-lf4jb" event={"ID":"a4909549-f2c4-45b0-a8f8-521302991297","Type":"ContainerDied","Data":"023eb4b942e99478ddc5c2302dbb0ec5737ecdcfe04fb54667164182410590d3"} Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.241088 4883 scope.go:117] "RemoveContainer" containerID="ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.241960 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.245974 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c052-account-create-update-hg4pd"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.273183 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data" (OuterVolumeSpecName: "config-data") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.273309 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.278624 4883 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.278653 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4909549-f2c4-45b0-a8f8-521302991297-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.278665 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.280308 4883 generic.go:334] "Generic (PLEG): container finished" podID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerID="9942d2bf12f93ebc9815c517a342ac88745bd4ad2c848364cca8746de3f97630" exitCode=143 Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.280458 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerDied","Data":"9942d2bf12f93ebc9815c517a342ac88745bd4ad2c848364cca8746de3f97630"} Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.292694 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts" (OuterVolumeSpecName: "scripts") pod "a4909549-f2c4-45b0-a8f8-521302991297" (UID: "a4909549-f2c4-45b0-a8f8-521302991297"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.299203 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerStarted","Data":"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7"} Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.338730 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.338964 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f6f8846bd-rdwfd" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-api" containerID="cri-o://02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d" gracePeriod=30 Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.339382 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f6f8846bd-rdwfd" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-httpd" containerID="cri-o://5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4" gracePeriod=30 Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.380757 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.381177 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxh9p\" (UniqueName: \"kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.381792 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4909549-f2c4-45b0-a8f8-521302991297-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.483402 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxh9p\" (UniqueName: \"kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.483760 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.484999 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.508008 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxh9p\" (UniqueName: \"kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p\") pod \"nova-cell1-c052-account-create-update-hg4pd\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.556929 4883 scope.go:117] "RemoveContainer" containerID="5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.582856 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.591052 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5fcc9bbb48-lf4jb"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.604404 4883 scope.go:117] "RemoveContainer" containerID="ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317" Mar 10 09:23:56 crc kubenswrapper[4883]: E0310 09:23:56.609580 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317\": container with ID starting with ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317 not found: ID does not exist" containerID="ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.609650 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317"} err="failed to get container status \"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317\": rpc error: code = NotFound desc = could not find container \"ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317\": container with ID starting with ec68859c7854aa4873133ec63d78db5fc7475e5ae5bae603cd65c95919677317 not found: ID does not exist" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.609697 4883 scope.go:117] "RemoveContainer" containerID="5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff" Mar 10 09:23:56 crc kubenswrapper[4883]: E0310 09:23:56.610865 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff\": container with ID starting with 5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff not found: ID does not exist" containerID="5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.610898 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff"} err="failed to get container status \"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff\": rpc error: code = NotFound desc = could not find container \"5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff\": container with ID starting with 5977ce591d1a7b8fc08ee5122a2dcd22ff0ee078d4fb7bb5920d0296f94304ff not found: ID does not exist" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.675112 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.748984 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f74b-account-create-update-lsxls"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.763371 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-l9ldx"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.919690 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4vxd6"] Mar 10 09:23:56 crc kubenswrapper[4883]: I0310 09:23:56.936788 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zr486"] Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.056504 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-15d6-account-create-update-lwjcj"] Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.185345 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c052-account-create-update-hg4pd"] Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.310677 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" event={"ID":"e9dd286b-6aa5-4525-a645-8e4ec79af348","Type":"ContainerStarted","Data":"328d6abd25cfa7a71e9b0831cc5309de56da4eea4f989a50a43c9c33525a5879"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.313603 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f74b-account-create-update-lsxls" event={"ID":"d355ddcd-9120-4436-84c4-928027e6ee33","Type":"ContainerStarted","Data":"ca65d438bf33bf28ae2760bd85500809892b5530a19b9249afc6d3b1e6b1e723"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.313704 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f74b-account-create-update-lsxls" 
event={"ID":"d355ddcd-9120-4436-84c4-928027e6ee33","Type":"ContainerStarted","Data":"6d8e8a7ffeaf42a416ff3d5dced177bcd0777c66e72676ad42d73d1bfa28b123"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.316449 4883 generic.go:334] "Generic (PLEG): container finished" podID="e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" containerID="edddf942ff54cf02d31c8d37d1a93a850752455b76c3f9b8d5acabfd5e985820" exitCode=0 Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.316516 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-l9ldx" event={"ID":"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8","Type":"ContainerDied","Data":"edddf942ff54cf02d31c8d37d1a93a850752455b76c3f9b8d5acabfd5e985820"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.316573 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-l9ldx" event={"ID":"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8","Type":"ContainerStarted","Data":"458dceb3e1cca2ec00511601cd3c6de401cd9e82c4e2ccda5ef9b21bd6f813bb"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.322596 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4vxd6" event={"ID":"fdbd0859-6f93-4118-9e5b-2170ec3d43ad","Type":"ContainerStarted","Data":"d23bd9bd934e3d99aa1bfc6ccd981d2c32f179368b140bc3665f8538d2c19637"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.322679 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4vxd6" event={"ID":"fdbd0859-6f93-4118-9e5b-2170ec3d43ad","Type":"ContainerStarted","Data":"8ad2396f2e7a9d3bbf9da675e807414641fe972e08a0b92dada5b43eb2016b98"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.328102 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" event={"ID":"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9","Type":"ContainerStarted","Data":"e1764b4c527a37975cc4e8ef8e724e3a447bf2698dc21042ec02c05dc2aa830c"} Mar 10 09:23:57 
crc kubenswrapper[4883]: I0310 09:23:57.328147 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" event={"ID":"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9","Type":"ContainerStarted","Data":"e69d19127eb3d9ce4f3e34fa37eba123f18c99b6eaeaed465228d938b7758035"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.331361 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-f74b-account-create-update-lsxls" podStartSLOduration=2.33134565 podStartE2EDuration="2.33134565s" podCreationTimestamp="2026-03-10 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:57.324623205 +0000 UTC m=+1223.579521094" watchObservedRunningTime="2026-03-10 09:23:57.33134565 +0000 UTC m=+1223.586243540" Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.331984 4883 generic.go:334] "Generic (PLEG): container finished" podID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerID="5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4" exitCode=0 Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.332044 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerDied","Data":"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.336089 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerStarted","Data":"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.338200 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zr486" 
event={"ID":"e39def71-60ef-4b2a-823b-1c5e89e02647","Type":"ContainerStarted","Data":"d00a8e52c275926dd98e27f733cab6f0434bb178955532902b15a5e7ab5d08f8"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.338251 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zr486" event={"ID":"e39def71-60ef-4b2a-823b-1c5e89e02647","Type":"ContainerStarted","Data":"5149fd889fe5536a323a71b17dfaf7c9c619e260b3b58f02d0982a4f00153649"} Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.357197 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-4vxd6" podStartSLOduration=2.357177011 podStartE2EDuration="2.357177011s" podCreationTimestamp="2026-03-10 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:57.354162035 +0000 UTC m=+1223.609059925" watchObservedRunningTime="2026-03-10 09:23:57.357177011 +0000 UTC m=+1223.612074900" Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.371772 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" podStartSLOduration=2.371753097 podStartE2EDuration="2.371753097s" podCreationTimestamp="2026-03-10 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:57.364066833 +0000 UTC m=+1223.618964722" watchObservedRunningTime="2026-03-10 09:23:57.371753097 +0000 UTC m=+1223.626650986" Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.415436 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-zr486" podStartSLOduration=2.415408528 podStartE2EDuration="2.415408528s" podCreationTimestamp="2026-03-10 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:23:57.399463931 +0000 UTC m=+1223.654361820" watchObservedRunningTime="2026-03-10 09:23:57.415408528 +0000 UTC m=+1223.670306417" Mar 10 09:23:57 crc kubenswrapper[4883]: I0310 09:23:57.739871 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.101116 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4909549-f2c4-45b0-a8f8-521302991297" path="/var/lib/kubelet/pods/a4909549-f2c4-45b0-a8f8-521302991297/volumes" Mar 10 09:23:58 crc kubenswrapper[4883]: E0310 09:23:58.167265 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9dd286b_6aa5_4525_a645_8e4ec79af348.slice/crio-911a3a55df750ade4baf157ba45ca477442b34e3cdb85095ba2fd8c7ca80b8fe.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.272351 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354036 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354126 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354153 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354226 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22fbf\" (UniqueName: \"kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354280 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354374 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354512 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts\") pod \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\" (UID: \"4853248e-5cd4-4cf3-b9e7-b824fad23efe\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.354660 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.355007 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs" (OuterVolumeSpecName: "logs") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.356684 4883 generic.go:334] "Generic (PLEG): container finished" podID="e39def71-60ef-4b2a-823b-1c5e89e02647" containerID="d00a8e52c275926dd98e27f733cab6f0434bb178955532902b15a5e7ab5d08f8" exitCode=0 Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.356774 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zr486" event={"ID":"e39def71-60ef-4b2a-823b-1c5e89e02647","Type":"ContainerDied","Data":"d00a8e52c275926dd98e27f733cab6f0434bb178955532902b15a5e7ab5d08f8"} Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.357546 4883 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4853248e-5cd4-4cf3-b9e7-b824fad23efe-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.357561 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4853248e-5cd4-4cf3-b9e7-b824fad23efe-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.359885 4883 generic.go:334] "Generic (PLEG): container finished" podID="e9dd286b-6aa5-4525-a645-8e4ec79af348" containerID="911a3a55df750ade4baf157ba45ca477442b34e3cdb85095ba2fd8c7ca80b8fe" exitCode=0 Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.359932 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" event={"ID":"e9dd286b-6aa5-4525-a645-8e4ec79af348","Type":"ContainerDied","Data":"911a3a55df750ade4baf157ba45ca477442b34e3cdb85095ba2fd8c7ca80b8fe"} Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.360188 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts" (OuterVolumeSpecName: "scripts") pod 
"4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.360399 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.361439 4883 generic.go:334] "Generic (PLEG): container finished" podID="f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" containerID="e1764b4c527a37975cc4e8ef8e724e3a447bf2698dc21042ec02c05dc2aa830c" exitCode=0 Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.361506 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" event={"ID":"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9","Type":"ContainerDied","Data":"e1764b4c527a37975cc4e8ef8e724e3a447bf2698dc21042ec02c05dc2aa830c"} Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.364620 4883 generic.go:334] "Generic (PLEG): container finished" podID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerID="a4d2619e8455e83dda94f13bca3f4af0a00dbc8f602e8d4ef336ec6c1e921f82" exitCode=0 Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.364664 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerDied","Data":"a4d2619e8455e83dda94f13bca3f4af0a00dbc8f602e8d4ef336ec6c1e921f82"} Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.371011 4883 generic.go:334] "Generic (PLEG): container finished" podID="d355ddcd-9120-4436-84c4-928027e6ee33" 
containerID="ca65d438bf33bf28ae2760bd85500809892b5530a19b9249afc6d3b1e6b1e723" exitCode=0 Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.371086 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f74b-account-create-update-lsxls" event={"ID":"d355ddcd-9120-4436-84c4-928027e6ee33","Type":"ContainerDied","Data":"ca65d438bf33bf28ae2760bd85500809892b5530a19b9249afc6d3b1e6b1e723"} Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.371763 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf" (OuterVolumeSpecName: "kube-api-access-22fbf") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "kube-api-access-22fbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.377254 4883 generic.go:334] "Generic (PLEG): container finished" podID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerID="4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5" exitCode=137 Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.377326 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerDied","Data":"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5"} Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.377363 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4853248e-5cd4-4cf3-b9e7-b824fad23efe","Type":"ContainerDied","Data":"987bc0cc38389c3b6190190ae5c16e3637e3022a5ee37c4c3bc24573be51664c"} Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.377384 4883 scope.go:117] "RemoveContainer" containerID="4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.377510 4883 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.384331 4883 generic.go:334] "Generic (PLEG): container finished" podID="fdbd0859-6f93-4118-9e5b-2170ec3d43ad" containerID="d23bd9bd934e3d99aa1bfc6ccd981d2c32f179368b140bc3665f8538d2c19637" exitCode=0 Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.384427 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4vxd6" event={"ID":"fdbd0859-6f93-4118-9e5b-2170ec3d43ad","Type":"ContainerDied","Data":"d23bd9bd934e3d99aa1bfc6ccd981d2c32f179368b140bc3665f8538d2c19637"} Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.386086 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerStarted","Data":"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b"} Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.393591 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.432559 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data" (OuterVolumeSpecName: "config-data") pod "4853248e-5cd4-4cf3-b9e7-b824fad23efe" (UID: "4853248e-5cd4-4cf3-b9e7-b824fad23efe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.438209 4883 scope.go:117] "RemoveContainer" containerID="0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.454838 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.457167 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6656f7cc-nv5pp" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.458552 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.458566 4883 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.458575 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.458583 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4853248e-5cd4-4cf3-b9e7-b824fad23efe-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.458592 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22fbf\" (UniqueName: \"kubernetes.io/projected/4853248e-5cd4-4cf3-b9e7-b824fad23efe-kube-api-access-22fbf\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.478204 4883 
scope.go:117] "RemoveContainer" containerID="4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5" Mar 10 09:23:58 crc kubenswrapper[4883]: E0310 09:23:58.478600 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5\": container with ID starting with 4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5 not found: ID does not exist" containerID="4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.478635 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5"} err="failed to get container status \"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5\": rpc error: code = NotFound desc = could not find container \"4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5\": container with ID starting with 4017fde5288ec4e11ad4c7f44f5785572e9e05286d970ee60b9806f1f5bf00a5 not found: ID does not exist" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.478654 4883 scope.go:117] "RemoveContainer" containerID="0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3" Mar 10 09:23:58 crc kubenswrapper[4883]: E0310 09:23:58.478844 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3\": container with ID starting with 0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3 not found: ID does not exist" containerID="0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.478859 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3"} err="failed to get container status \"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3\": rpc error: code = NotFound desc = could not find container \"0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3\": container with ID starting with 0cc2b78213732dac7da6f8055411654aa74efbdb3eed8efca9b39869d75593c3 not found: ID does not exist" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.738041 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.750309 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.770704 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:58 crc kubenswrapper[4883]: E0310 09:23:58.771402 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api-log" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.771417 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api-log" Mar 10 09:23:58 crc kubenswrapper[4883]: E0310 09:23:58.771444 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.771465 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.771962 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api-log" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.771992 4883 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" containerName="cinder-api" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.773151 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.777843 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.777912 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.778132 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.785392 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.853510 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.869757 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caba5f6-d05e-437e-868c-952e8adf3278-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.869816 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870025 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-scripts\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870157 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2caba5f6-d05e-437e-868c-952e8adf3278-logs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870192 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmvk\" (UniqueName: \"kubernetes.io/projected/2caba5f6-d05e-437e-868c-952e8adf3278-kube-api-access-tgmvk\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870217 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data-custom\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870266 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870315 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.870333 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.874429 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-l9ldx" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971565 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971702 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971823 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971854 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts\") pod \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971872 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971890 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971919 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqqsw\" (UniqueName: \"kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw\") pod \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\" (UID: \"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.971975 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8mhb\" (UniqueName: \"kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972041 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972063 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs\") pod \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\" (UID: \"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3\") " Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972561 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-scripts\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972654 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2caba5f6-d05e-437e-868c-952e8adf3278-logs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972684 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmvk\" (UniqueName: \"kubernetes.io/projected/2caba5f6-d05e-437e-868c-952e8adf3278-kube-api-access-tgmvk\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972704 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data-custom\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972727 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972755 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972769 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-internal-tls-certs\") pod \"cinder-api-0\" 
(UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972791 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caba5f6-d05e-437e-868c-952e8adf3278-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.972816 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.973035 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.973412 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2caba5f6-d05e-437e-868c-952e8adf3278-logs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.976889 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb" (OuterVolumeSpecName: "kube-api-access-z8mhb") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "kube-api-access-z8mhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.978090 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" (UID: "e694b4cb-0aa6-46d5-b6be-039d6a92e4a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.983874 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.983935 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2caba5f6-d05e-437e-868c-952e8adf3278-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.984284 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs" (OuterVolumeSpecName: "logs") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.984740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw" (OuterVolumeSpecName: "kube-api-access-tqqsw") pod "e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" (UID: "e694b4cb-0aa6-46d5-b6be-039d6a92e4a8"). InnerVolumeSpecName "kube-api-access-tqqsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.985804 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.991405 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.991736 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.992270 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-config-data-custom\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.993121 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.995954 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2caba5f6-d05e-437e-868c-952e8adf3278-scripts\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.995990 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts" (OuterVolumeSpecName: "scripts") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:58 crc kubenswrapper[4883]: I0310 09:23:58.996017 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmvk\" (UniqueName: \"kubernetes.io/projected/2caba5f6-d05e-437e-868c-952e8adf3278-kube-api-access-tgmvk\") pod \"cinder-api-0\" (UID: \"2caba5f6-d05e-437e-868c-952e8adf3278\") " pod="openstack/cinder-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.033652 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.041324 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data" (OuterVolumeSpecName: "config-data") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.071399 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" (UID: "4a8da6e1-ae37-44df-bbe8-aaaeda402bd3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077200 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077224 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077237 4883 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077250 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:59 crc 
kubenswrapper[4883]: I0310 09:23:59.077259 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077267 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077276 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqqsw\" (UniqueName: \"kubernetes.io/projected/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8-kube-api-access-tqqsw\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077287 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8mhb\" (UniqueName: \"kubernetes.io/projected/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-kube-api-access-z8mhb\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077296 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.077303 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.092454 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.163886 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.179637 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.403320 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-l9ldx" event={"ID":"e694b4cb-0aa6-46d5-b6be-039d6a92e4a8","Type":"ContainerDied","Data":"458dceb3e1cca2ec00511601cd3c6de401cd9e82c4e2ccda5ef9b21bd6f813bb"} Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.403378 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="458dceb3e1cca2ec00511601cd3c6de401cd9e82c4e2ccda5ef9b21bd6f813bb" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.403468 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-l9ldx" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.409359 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4a8da6e1-ae37-44df-bbe8-aaaeda402bd3","Type":"ContainerDied","Data":"4d280b27ae76beef2731a3863818dd720d9ca5f105e0f710f7b3f7d025052c9f"} Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.409417 4883 scope.go:117] "RemoveContainer" containerID="a4d2619e8455e83dda94f13bca3f4af0a00dbc8f602e8d4ef336ec6c1e921f82" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.409672 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.449413 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.469176 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.476447 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:59 crc kubenswrapper[4883]: E0310 09:23:59.476897 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" containerName="mariadb-database-create" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.476916 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" containerName="mariadb-database-create" Mar 10 09:23:59 crc kubenswrapper[4883]: E0310 09:23:59.476925 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-httpd" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.476932 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-httpd" Mar 10 09:23:59 crc kubenswrapper[4883]: E0310 09:23:59.476947 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-log" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.476953 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-log" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.477163 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" containerName="mariadb-database-create" Mar 10 09:23:59 crc 
kubenswrapper[4883]: I0310 09:23:59.477184 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-httpd" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.477194 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" containerName="glance-log" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.478159 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.482825 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.482973 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.486329 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.528954 4883 scope.go:117] "RemoveContainer" containerID="9942d2bf12f93ebc9815c517a342ac88745bd4ad2c848364cca8746de3f97630" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.592164 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594073 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-logs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " 
pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594194 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlr2j\" (UniqueName: \"kubernetes.io/projected/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-kube-api-access-zlr2j\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594283 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594342 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594645 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.594694 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.632857 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 09:23:59 crc kubenswrapper[4883]: W0310 09:23:59.656402 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2caba5f6_d05e_437e_868c_952e8adf3278.slice/crio-601c1138d87e47570d65bae4260a979f42c8ffaf3e8bce68acb093f5669efb1b WatchSource:0}: Error finding container 601c1138d87e47570d65bae4260a979f42c8ffaf3e8bce68acb093f5669efb1b: Status 404 returned error can't find the container with id 601c1138d87e47570d65bae4260a979f42c8ffaf3e8bce68acb093f5669efb1b Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.700135 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.700202 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 
09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.700236 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.701036 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.701101 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-logs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.701164 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlr2j\" (UniqueName: \"kubernetes.io/projected/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-kube-api-access-zlr2j\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.701217 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.701251 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.702623 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.702922 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.705214 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-logs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.705776 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.716253 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.723723 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.727995 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.729710 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlr2j\" (UniqueName: \"kubernetes.io/projected/4c354fe8-851f-4cf4-bc13-e06dba0a1cc0-kube-api-access-zlr2j\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.750922 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0\") " pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.790766 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.807911 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.909642 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts\") pod \"d355ddcd-9120-4436-84c4-928027e6ee33\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.909994 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmdwp\" (UniqueName: \"kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp\") pod \"d355ddcd-9120-4436-84c4-928027e6ee33\" (UID: \"d355ddcd-9120-4436-84c4-928027e6ee33\") " Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.911982 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d355ddcd-9120-4436-84c4-928027e6ee33" (UID: "d355ddcd-9120-4436-84c4-928027e6ee33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:23:59 crc kubenswrapper[4883]: I0310 09:23:59.933784 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp" (OuterVolumeSpecName: "kube-api-access-pmdwp") pod "d355ddcd-9120-4436-84c4-928027e6ee33" (UID: "d355ddcd-9120-4436-84c4-928027e6ee33"). InnerVolumeSpecName "kube-api-access-pmdwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.016716 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d355ddcd-9120-4436-84c4-928027e6ee33-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.016748 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmdwp\" (UniqueName: \"kubernetes.io/projected/d355ddcd-9120-4436-84c4-928027e6ee33-kube-api-access-pmdwp\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.067886 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.073520 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.076662 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.091933 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.094761 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4853248e-5cd4-4cf3-b9e7-b824fad23efe" path="/var/lib/kubelet/pods/4853248e-5cd4-4cf3-b9e7-b824fad23efe/volumes" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.095720 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a8da6e1-ae37-44df-bbe8-aaaeda402bd3" path="/var/lib/kubelet/pods/4a8da6e1-ae37-44df-bbe8-aaaeda402bd3/volumes" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149372 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552244-zwxrg"] Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.149816 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149836 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.149850 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdbd0859-6f93-4118-9e5b-2170ec3d43ad" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149857 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdbd0859-6f93-4118-9e5b-2170ec3d43ad" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.149870 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9dd286b-6aa5-4525-a645-8e4ec79af348" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149878 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9dd286b-6aa5-4525-a645-8e4ec79af348" 
containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.149897 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e39def71-60ef-4b2a-823b-1c5e89e02647" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149902 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e39def71-60ef-4b2a-823b-1c5e89e02647" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.149926 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d355ddcd-9120-4436-84c4-928027e6ee33" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.149933 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d355ddcd-9120-4436-84c4-928027e6ee33" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.150178 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d355ddcd-9120-4436-84c4-928027e6ee33" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.150221 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.150232 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9dd286b-6aa5-4525-a645-8e4ec79af348" containerName="mariadb-account-create-update" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.150246 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e39def71-60ef-4b2a-823b-1c5e89e02647" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.150265 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdbd0859-6f93-4118-9e5b-2170ec3d43ad" containerName="mariadb-database-create" Mar 10 09:24:00 crc kubenswrapper[4883]: 
I0310 09:24:00.151136 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.154858 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.155016 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.155224 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.156060 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-zwxrg"] Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.213421 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.221114 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bll5g\" (UniqueName: \"kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g\") pod \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.228000 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxh9p\" (UniqueName: \"kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p\") pod \"e9dd286b-6aa5-4525-a645-8e4ec79af348\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.233532 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd5d5\" (UniqueName: 
\"kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5\") pod \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.233617 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts\") pod \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\" (UID: \"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.234394 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts\") pod \"e39def71-60ef-4b2a-823b-1c5e89e02647\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.234520 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts\") pod \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\" (UID: \"fdbd0859-6f93-4118-9e5b-2170ec3d43ad\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.234571 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzwb6\" (UniqueName: \"kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6\") pod \"e39def71-60ef-4b2a-823b-1c5e89e02647\" (UID: \"e39def71-60ef-4b2a-823b-1c5e89e02647\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.234800 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts\") pod \"e9dd286b-6aa5-4525-a645-8e4ec79af348\" (UID: \"e9dd286b-6aa5-4525-a645-8e4ec79af348\") " Mar 10 09:24:00 
crc kubenswrapper[4883]: I0310 09:24:00.235425 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvpgx\" (UniqueName: \"kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx\") pod \"auto-csr-approver-29552244-zwxrg\" (UID: \"391543cc-519b-4e01-8886-04bde62c5298\") " pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.236319 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e39def71-60ef-4b2a-823b-1c5e89e02647" (UID: "e39def71-60ef-4b2a-823b-1c5e89e02647"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.236884 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fdbd0859-6f93-4118-9e5b-2170ec3d43ad" (UID: "fdbd0859-6f93-4118-9e5b-2170ec3d43ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.237552 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" (UID: "f9eca221-08d6-4b22-8d5c-1cd9e95c65d9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.238460 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9dd286b-6aa5-4525-a645-8e4ec79af348" (UID: "e9dd286b-6aa5-4525-a645-8e4ec79af348"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.239581 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g" (OuterVolumeSpecName: "kube-api-access-bll5g") pod "fdbd0859-6f93-4118-9e5b-2170ec3d43ad" (UID: "fdbd0859-6f93-4118-9e5b-2170ec3d43ad"). InnerVolumeSpecName "kube-api-access-bll5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.253707 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5" (OuterVolumeSpecName: "kube-api-access-vd5d5") pod "f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" (UID: "f9eca221-08d6-4b22-8d5c-1cd9e95c65d9"). InnerVolumeSpecName "kube-api-access-vd5d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.257917 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6" (OuterVolumeSpecName: "kube-api-access-gzwb6") pod "e39def71-60ef-4b2a-823b-1c5e89e02647" (UID: "e39def71-60ef-4b2a-823b-1c5e89e02647"). InnerVolumeSpecName "kube-api-access-gzwb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.271138 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p" (OuterVolumeSpecName: "kube-api-access-bxh9p") pod "e9dd286b-6aa5-4525-a645-8e4ec79af348" (UID: "e9dd286b-6aa5-4525-a645-8e4ec79af348"). InnerVolumeSpecName "kube-api-access-bxh9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337411 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337550 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337595 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337638 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337816 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337874 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.337954 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs" (OuterVolumeSpecName: "logs") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.338057 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.338181 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff6w4\" (UniqueName: \"kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4\") pod \"f9967357-b98f-4e31-9934-f99669b31024\" (UID: \"f9967357-b98f-4e31-9934-f99669b31024\") " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.338199 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: 
"f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342105 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvpgx\" (UniqueName: \"kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx\") pod \"auto-csr-approver-29552244-zwxrg\" (UID: \"391543cc-519b-4e01-8886-04bde62c5298\") " pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342373 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342396 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f9967357-b98f-4e31-9934-f99669b31024-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342410 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9dd286b-6aa5-4525-a645-8e4ec79af348-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342422 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bll5g\" (UniqueName: \"kubernetes.io/projected/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-kube-api-access-bll5g\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342436 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxh9p\" (UniqueName: \"kubernetes.io/projected/e9dd286b-6aa5-4525-a645-8e4ec79af348-kube-api-access-bxh9p\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342447 4883 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-vd5d5\" (UniqueName: \"kubernetes.io/projected/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-kube-api-access-vd5d5\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342458 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342467 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e39def71-60ef-4b2a-823b-1c5e89e02647-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342494 4883 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdbd0859-6f93-4118-9e5b-2170ec3d43ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.342504 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzwb6\" (UniqueName: \"kubernetes.io/projected/e39def71-60ef-4b2a-823b-1c5e89e02647-kube-api-access-gzwb6\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.362218 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.369586 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvpgx\" (UniqueName: \"kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx\") pod \"auto-csr-approver-29552244-zwxrg\" (UID: \"391543cc-519b-4e01-8886-04bde62c5298\") " pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.388433 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts" (OuterVolumeSpecName: "scripts") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.401269 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4" (OuterVolumeSpecName: "kube-api-access-ff6w4") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "kube-api-access-ff6w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.420981 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.449011 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" event={"ID":"e9dd286b-6aa5-4525-a645-8e4ec79af348","Type":"ContainerDied","Data":"328d6abd25cfa7a71e9b0831cc5309de56da4eea4f989a50a43c9c33525a5879"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.449084 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="328d6abd25cfa7a71e9b0831cc5309de56da4eea4f989a50a43c9c33525a5879" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.449231 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c052-account-create-update-hg4pd" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453051 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff6w4\" (UniqueName: \"kubernetes.io/projected/f9967357-b98f-4e31-9934-f99669b31024-kube-api-access-ff6w4\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453522 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453585 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453658 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453815 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f74b-account-create-update-lsxls" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453819 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f74b-account-create-update-lsxls" event={"ID":"d355ddcd-9120-4436-84c4-928027e6ee33","Type":"ContainerDied","Data":"6d8e8a7ffeaf42a416ff3d5dced177bcd0777c66e72676ad42d73d1bfa28b123"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.453864 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d8e8a7ffeaf42a416ff3d5dced177bcd0777c66e72676ad42d73d1bfa28b123" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.461733 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerStarted","Data":"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.461905 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-central-agent" containerID="cri-o://be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7" gracePeriod=30 Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.462009 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.462045 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="proxy-httpd" containerID="cri-o://f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db" gracePeriod=30 Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.462118 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" 
containerName="ceilometer-notification-agent" containerID="cri-o://21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a" gracePeriod=30 Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.462231 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="sg-core" containerID="cri-o://e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b" gracePeriod=30 Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.482565 4883 generic.go:334] "Generic (PLEG): container finished" podID="f9967357-b98f-4e31-9934-f99669b31024" containerID="93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032" exitCode=0 Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.482655 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerDied","Data":"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.482680 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f9967357-b98f-4e31-9934-f99669b31024","Type":"ContainerDied","Data":"71b700d498775842c5c3b3c9b8a10f0292a828bd6a69eebba134bc667b2b5df6"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.482725 4883 scope.go:117] "RemoveContainer" containerID="93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.482904 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.491044 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.491548 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.495199 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.498435 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.888122954 podStartE2EDuration="11.498422377s" podCreationTimestamp="2026-03-10 09:23:49 +0000 UTC" firstStartedPulling="2026-03-10 09:23:54.97045563 +0000 UTC m=+1221.225353520" lastFinishedPulling="2026-03-10 09:23:59.580755054 +0000 UTC m=+1225.835652943" observedRunningTime="2026-03-10 09:24:00.483805014 +0000 UTC m=+1226.738702903" watchObservedRunningTime="2026-03-10 09:24:00.498422377 +0000 UTC m=+1226.753320266" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.510706 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zr486" event={"ID":"e39def71-60ef-4b2a-823b-1c5e89e02647","Type":"ContainerDied","Data":"5149fd889fe5536a323a71b17dfaf7c9c619e260b3b58f02d0982a4f00153649"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.510757 4883 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="5149fd889fe5536a323a71b17dfaf7c9c619e260b3b58f02d0982a4f00153649" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.510841 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zr486" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.513327 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data" (OuterVolumeSpecName: "config-data") pod "f9967357-b98f-4e31-9934-f99669b31024" (UID: "f9967357-b98f-4e31-9934-f99669b31024"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.513530 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2caba5f6-d05e-437e-868c-952e8adf3278","Type":"ContainerStarted","Data":"601c1138d87e47570d65bae4260a979f42c8ffaf3e8bce68acb093f5669efb1b"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.520349 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4vxd6" event={"ID":"fdbd0859-6f93-4118-9e5b-2170ec3d43ad","Type":"ContainerDied","Data":"8ad2396f2e7a9d3bbf9da675e807414641fe972e08a0b92dada5b43eb2016b98"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.520388 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ad2396f2e7a9d3bbf9da675e807414641fe972e08a0b92dada5b43eb2016b98" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.520444 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4vxd6" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.527811 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" event={"ID":"f9eca221-08d6-4b22-8d5c-1cd9e95c65d9","Type":"ContainerDied","Data":"e69d19127eb3d9ce4f3e34fa37eba123f18c99b6eaeaed465228d938b7758035"} Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.527841 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e69d19127eb3d9ce4f3e34fa37eba123f18c99b6eaeaed465228d938b7758035" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.528124 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-15d6-account-create-update-lwjcj" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.528946 4883 scope.go:117] "RemoveContainer" containerID="feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.529754 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.566965 4883 scope.go:117] "RemoveContainer" containerID="93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.567383 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032\": container with ID starting with 93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032 not found: ID does not exist" containerID="93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.567418 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032"} err="failed to get container status \"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032\": rpc error: code = NotFound desc = could not find container \"93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032\": container with ID starting with 93db55de600e7c57e3c6d41b54754666e2e3a0068cd5161376c6a0b2e73ce032 not found: ID does not exist" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.567460 4883 scope.go:117] "RemoveContainer" containerID="feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.567966 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731\": container with ID starting with feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731 not found: ID does not exist" containerID="feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.568072 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731"} err="failed to get container status \"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731\": rpc error: code = NotFound desc = could not find container \"feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731\": container with ID starting with feaa17ed8b6375a451bfb6d027bd48ce7029c3347eef4eb6fd36230c17b9a731 not found: ID does not exist" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.571772 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 
09:24:00.571795 4883 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.571806 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9967357-b98f-4e31-9934-f99669b31024-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.853898 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.864522 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.878073 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.878623 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-httpd" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.878644 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-httpd" Mar 10 09:24:00 crc kubenswrapper[4883]: E0310 09:24:00.878674 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-log" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.878681 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-log" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.878867 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-httpd" Mar 10 09:24:00 crc kubenswrapper[4883]: 
I0310 09:24:00.878901 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9967357-b98f-4e31-9934-f99669b31024" containerName="glance-log" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.879959 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.881973 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.882382 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.882541 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.999762 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfhwv\" (UniqueName: \"kubernetes.io/projected/676458e7-e4a0-4f1a-b200-0ab75faaddb4-kube-api-access-gfhwv\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.999839 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:00 crc kubenswrapper[4883]: I0310 09:24:00.999870 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.000083 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.000248 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.000404 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.000437 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.000507 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.027712 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-zwxrg"] Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.103305 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.105724 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.105995 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.106086 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.106180 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.106203 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.106246 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.106716 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhwv\" (UniqueName: \"kubernetes.io/projected/676458e7-e4a0-4f1a-b200-0ab75faaddb4-kube-api-access-gfhwv\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.103992 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.108122 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/676458e7-e4a0-4f1a-b200-0ab75faaddb4-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.110152 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.125681 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.144699 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.145466 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.151056 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfhwv\" (UniqueName: \"kubernetes.io/projected/676458e7-e4a0-4f1a-b200-0ab75faaddb4-kube-api-access-gfhwv\") pod \"glance-default-internal-api-0\" (UID: 
\"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.152614 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676458e7-e4a0-4f1a-b200-0ab75faaddb4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.169565 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"676458e7-e4a0-4f1a-b200-0ab75faaddb4\") " pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.210422 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.273977 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.415009 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs\") pod \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.415494 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc9sx\" (UniqueName: \"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx\") pod \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.415674 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config\") pod \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.415760 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config\") pod \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.415804 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle\") pod \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\" (UID: \"de4a8a41-06f6-4d5a-939c-22eebc30b0d8\") " Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.421393 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx" (OuterVolumeSpecName: "kube-api-access-xc9sx") pod "de4a8a41-06f6-4d5a-939c-22eebc30b0d8" (UID: "de4a8a41-06f6-4d5a-939c-22eebc30b0d8"). InnerVolumeSpecName "kube-api-access-xc9sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.430375 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "de4a8a41-06f6-4d5a-939c-22eebc30b0d8" (UID: "de4a8a41-06f6-4d5a-939c-22eebc30b0d8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.468863 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config" (OuterVolumeSpecName: "config") pod "de4a8a41-06f6-4d5a-939c-22eebc30b0d8" (UID: "de4a8a41-06f6-4d5a-939c-22eebc30b0d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.483841 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de4a8a41-06f6-4d5a-939c-22eebc30b0d8" (UID: "de4a8a41-06f6-4d5a-939c-22eebc30b0d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.508665 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "de4a8a41-06f6-4d5a-939c-22eebc30b0d8" (UID: "de4a8a41-06f6-4d5a-939c-22eebc30b0d8"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.518715 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc9sx\" (UniqueName: \"kubernetes.io/projected/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-kube-api-access-xc9sx\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.518742 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.518752 4883 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.518761 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.518771 4883 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de4a8a41-06f6-4d5a-939c-22eebc30b0d8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559256 4883 generic.go:334] "Generic (PLEG): container finished" podID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerID="f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db" exitCode=0 Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559300 4883 generic.go:334] "Generic (PLEG): container finished" podID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerID="e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b" exitCode=2 Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559309 4883 generic.go:334] "Generic (PLEG): 
container finished" podID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerID="21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a" exitCode=0 Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559358 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerDied","Data":"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559387 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerDied","Data":"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.559400 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerDied","Data":"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.573945 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0","Type":"ContainerStarted","Data":"2d927a827f1b57a3c74a0466ad13ecf698a963090190150471c91fa25975bf4f"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.574008 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0","Type":"ContainerStarted","Data":"a2dbead07466ce40aa853155f09d4924420bedf2219e3d697667f7310f78ef12"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.577940 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" event={"ID":"391543cc-519b-4e01-8886-04bde62c5298","Type":"ContainerStarted","Data":"f33d45e49c487b98183faf1adc73353bffd3605e2e4034356d70d18beabca3f0"} 
Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.583861 4883 generic.go:334] "Generic (PLEG): container finished" podID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerID="02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d" exitCode=0 Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.583925 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerDied","Data":"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.583955 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f6f8846bd-rdwfd" event={"ID":"de4a8a41-06f6-4d5a-939c-22eebc30b0d8","Type":"ContainerDied","Data":"98d045da7fa92d2ea6ec832a583b37763ca71714b8c37b66a7d614c5c8099df1"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.583977 4883 scope.go:117] "RemoveContainer" containerID="5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.583994 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f6f8846bd-rdwfd" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.607992 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2caba5f6-d05e-437e-868c-952e8adf3278","Type":"ContainerStarted","Data":"e7708b6c2d732bd1f8d0d9576d52f4af9f622df129248fbf8312d95b061e492f"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.608021 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2caba5f6-d05e-437e-868c-952e8adf3278","Type":"ContainerStarted","Data":"eda0dc97c6e82a19cf11402d6161e9b3e6ba6c1878c7e9895582b4a78c155d76"} Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.608129 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.661218 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.6611959450000002 podStartE2EDuration="3.661195945s" podCreationTimestamp="2026-03-10 09:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:01.64080287 +0000 UTC m=+1227.895700759" watchObservedRunningTime="2026-03-10 09:24:01.661195945 +0000 UTC m=+1227.916093834" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.679204 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"] Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.681096 4883 scope.go:117] "RemoveContainer" containerID="02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.687705 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f6f8846bd-rdwfd"] Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.708646 4883 scope.go:117] "RemoveContainer" 
containerID="5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4" Mar 10 09:24:01 crc kubenswrapper[4883]: E0310 09:24:01.712669 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4\": container with ID starting with 5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4 not found: ID does not exist" containerID="5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.712727 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4"} err="failed to get container status \"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4\": rpc error: code = NotFound desc = could not find container \"5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4\": container with ID starting with 5b0fff09aca44b401c5363538a6e5bcd43727cca3f4649544e8958b07995e2a4 not found: ID does not exist" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.712761 4883 scope.go:117] "RemoveContainer" containerID="02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d" Mar 10 09:24:01 crc kubenswrapper[4883]: E0310 09:24:01.716561 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d\": container with ID starting with 02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d not found: ID does not exist" containerID="02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.716598 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d"} err="failed to get container status \"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d\": rpc error: code = NotFound desc = could not find container \"02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d\": container with ID starting with 02181980dc9889eaa91411409570c9d44a641e47cf30bca6537f10002e51d81d not found: ID does not exist" Mar 10 09:24:01 crc kubenswrapper[4883]: I0310 09:24:01.749891 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 09:24:01 crc kubenswrapper[4883]: W0310 09:24:01.758601 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676458e7_e4a0_4f1a_b200_0ab75faaddb4.slice/crio-28dccf89d163e87b333b06a1d113b6744cf5ce84a4006385fb3291cad97fab95 WatchSource:0}: Error finding container 28dccf89d163e87b333b06a1d113b6744cf5ce84a4006385fb3291cad97fab95: Status 404 returned error can't find the container with id 28dccf89d163e87b333b06a1d113b6744cf5ce84a4006385fb3291cad97fab95 Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.098970 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" path="/var/lib/kubelet/pods/de4a8a41-06f6-4d5a-939c-22eebc30b0d8/volumes" Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.100043 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9967357-b98f-4e31-9934-f99669b31024" path="/var/lib/kubelet/pods/f9967357-b98f-4e31-9934-f99669b31024/volumes" Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.622964 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"676458e7-e4a0-4f1a-b200-0ab75faaddb4","Type":"ContainerStarted","Data":"beb67b616015fe9791d1ac986df0b2ba15e2cdca44d7b610c14f4bb40905ea5e"} Mar 10 
09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.623251 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"676458e7-e4a0-4f1a-b200-0ab75faaddb4","Type":"ContainerStarted","Data":"28dccf89d163e87b333b06a1d113b6744cf5ce84a4006385fb3291cad97fab95"} Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.627147 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c354fe8-851f-4cf4-bc13-e06dba0a1cc0","Type":"ContainerStarted","Data":"18e79717e974d3ac224da6f6ea6c6f16e46988561d4afeb1e5d571670e122dbd"} Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.635614 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" event={"ID":"391543cc-519b-4e01-8886-04bde62c5298","Type":"ContainerStarted","Data":"6ae90501c695dbc91d1432637dfb47eec6c627c8a17eb905f14422cd13154c3c"} Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.645222 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.645202203 podStartE2EDuration="3.645202203s" podCreationTimestamp="2026-03-10 09:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:02.644739581 +0000 UTC m=+1228.899637470" watchObservedRunningTime="2026-03-10 09:24:02.645202203 +0000 UTC m=+1228.900100092" Mar 10 09:24:02 crc kubenswrapper[4883]: I0310 09:24:02.658836 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" podStartSLOduration=1.663565262 podStartE2EDuration="2.65881887s" podCreationTimestamp="2026-03-10 09:24:00 +0000 UTC" firstStartedPulling="2026-03-10 09:24:01.042406682 +0000 UTC m=+1227.297304571" lastFinishedPulling="2026-03-10 09:24:02.037660289 +0000 UTC m=+1228.292558179" 
observedRunningTime="2026-03-10 09:24:02.656821243 +0000 UTC m=+1228.911719132" watchObservedRunningTime="2026-03-10 09:24:02.65881887 +0000 UTC m=+1228.913716759" Mar 10 09:24:03 crc kubenswrapper[4883]: I0310 09:24:03.647172 4883 generic.go:334] "Generic (PLEG): container finished" podID="391543cc-519b-4e01-8886-04bde62c5298" containerID="6ae90501c695dbc91d1432637dfb47eec6c627c8a17eb905f14422cd13154c3c" exitCode=0 Mar 10 09:24:03 crc kubenswrapper[4883]: I0310 09:24:03.647244 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" event={"ID":"391543cc-519b-4e01-8886-04bde62c5298","Type":"ContainerDied","Data":"6ae90501c695dbc91d1432637dfb47eec6c627c8a17eb905f14422cd13154c3c"} Mar 10 09:24:03 crc kubenswrapper[4883]: I0310 09:24:03.650170 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"676458e7-e4a0-4f1a-b200-0ab75faaddb4","Type":"ContainerStarted","Data":"02a407f3ae262acf186ff2a707a604e3fc6f0360578ee27f45091e9469bb630c"} Mar 10 09:24:03 crc kubenswrapper[4883]: I0310 09:24:03.679888 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.679868761 podStartE2EDuration="3.679868761s" podCreationTimestamp="2026-03-10 09:24:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:03.676853495 +0000 UTC m=+1229.931751385" watchObservedRunningTime="2026-03-10 09:24:03.679868761 +0000 UTC m=+1229.934766649" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.215826 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.405996 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406062 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs4v8\" (UniqueName: \"kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406092 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406117 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406140 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406157 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406178 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd\") pod \"2aae9177-76f0-4502-8f6a-19ad69a255ae\" (UID: \"2aae9177-76f0-4502-8f6a-19ad69a255ae\") " Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.406869 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.414022 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.417907 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8" (OuterVolumeSpecName: "kube-api-access-xs4v8") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "kube-api-access-xs4v8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.434521 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts" (OuterVolumeSpecName: "scripts") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.441210 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.463188 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.486670 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data" (OuterVolumeSpecName: "config-data") pod "2aae9177-76f0-4502-8f6a-19ad69a255ae" (UID: "2aae9177-76f0-4502-8f6a-19ad69a255ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510157 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510193 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs4v8\" (UniqueName: \"kubernetes.io/projected/2aae9177-76f0-4502-8f6a-19ad69a255ae-kube-api-access-xs4v8\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510231 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510245 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510255 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510266 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aae9177-76f0-4502-8f6a-19ad69a255ae-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.510276 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aae9177-76f0-4502-8f6a-19ad69a255ae-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.664542 4883 generic.go:334] "Generic 
(PLEG): container finished" podID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerID="be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7" exitCode=0 Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.664671 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.664750 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerDied","Data":"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7"} Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.664796 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aae9177-76f0-4502-8f6a-19ad69a255ae","Type":"ContainerDied","Data":"8662768bfd83f0fff77bedd9babc38ac435600a470cdeb93306da3bcece7d468"} Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.664837 4883 scope.go:117] "RemoveContainer" containerID="f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.696649 4883 scope.go:117] "RemoveContainer" containerID="e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.715719 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.730952 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737166 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737619 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="proxy-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737638 4883 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="proxy-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737651 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-notification-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737657 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-notification-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737667 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-central-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737673 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-central-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737690 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-api" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737696 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-api" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737704 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737709 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.737718 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="sg-core" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 
09:24:04.737725 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="sg-core" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737895 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737913 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="de4a8a41-06f6-4d5a-939c-22eebc30b0d8" containerName="neutron-api" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737922 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="proxy-httpd" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737931 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-notification-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737941 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="sg-core" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.737952 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" containerName="ceilometer-central-agent" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.738525 4883 scope.go:117] "RemoveContainer" containerID="21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.746922 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.747059 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.749538 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.749792 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.776582 4883 scope.go:117] "RemoveContainer" containerID="be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.811370 4883 scope.go:117] "RemoveContainer" containerID="f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.812327 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db\": container with ID starting with f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db not found: ID does not exist" containerID="f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.812375 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db"} err="failed to get container status \"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db\": rpc error: code = NotFound desc = could not find container \"f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db\": container with ID starting with f3942109f1794a02f5ee4fd9beffa45f51061376233b20996bd68bde8c5428db not found: ID does not exist" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.812406 4883 scope.go:117] "RemoveContainer" containerID="e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b" Mar 10 09:24:04 crc 
kubenswrapper[4883]: E0310 09:24:04.813003 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b\": container with ID starting with e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b not found: ID does not exist" containerID="e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.813046 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b"} err="failed to get container status \"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b\": rpc error: code = NotFound desc = could not find container \"e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b\": container with ID starting with e184f8847ceb04ce06136429467a2903069536e5e305a510cf43861f6230770b not found: ID does not exist" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.813063 4883 scope.go:117] "RemoveContainer" containerID="21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.813420 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a\": container with ID starting with 21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a not found: ID does not exist" containerID="21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.813447 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a"} err="failed to get container status 
\"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a\": rpc error: code = NotFound desc = could not find container \"21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a\": container with ID starting with 21fb9597f73eb95f25b732901a219764e3b0f931a29b867e7744d5cc8788ef5a not found: ID does not exist" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.813463 4883 scope.go:117] "RemoveContainer" containerID="be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7" Mar 10 09:24:04 crc kubenswrapper[4883]: E0310 09:24:04.813908 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7\": container with ID starting with be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7 not found: ID does not exist" containerID="be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.813950 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7"} err="failed to get container status \"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7\": rpc error: code = NotFound desc = could not find container \"be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7\": container with ID starting with be483f7e93c7ed77cc3852497bdb9d4f5d964f56fc828e53abd0232a264b80d7 not found: ID does not exist" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921130 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921221 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921282 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921305 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921341 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxwp2\" (UniqueName: \"kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921365 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.921902 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:04 crc kubenswrapper[4883]: I0310 09:24:04.968942 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024533 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024678 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024717 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024740 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024776 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxwp2\" (UniqueName: 
\"kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024812 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.024888 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.025497 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.025518 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.031652 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.032525 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.033061 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.034994 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.045616 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxwp2\" (UniqueName: \"kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2\") pod \"ceilometer-0\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.065989 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.126431 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvpgx\" (UniqueName: \"kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx\") pod \"391543cc-519b-4e01-8886-04bde62c5298\" (UID: \"391543cc-519b-4e01-8886-04bde62c5298\") " Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.130507 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx" (OuterVolumeSpecName: "kube-api-access-hvpgx") pod "391543cc-519b-4e01-8886-04bde62c5298" (UID: "391543cc-519b-4e01-8886-04bde62c5298"). InnerVolumeSpecName "kube-api-access-hvpgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.231744 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvpgx\" (UniqueName: \"kubernetes.io/projected/391543cc-519b-4e01-8886-04bde62c5298-kube-api-access-hvpgx\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.511291 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:05 crc kubenswrapper[4883]: W0310 09:24:05.514727 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f8485ae_b380_4555_8f4a_a71544094774.slice/crio-b4032a0a620f17cd3f9ee7e418f4d03002b4d54e275f86e2be2b1e53a4c27ebe WatchSource:0}: Error finding container b4032a0a620f17cd3f9ee7e418f4d03002b4d54e275f86e2be2b1e53a4c27ebe: Status 404 returned error can't find the container with id b4032a0a620f17cd3f9ee7e418f4d03002b4d54e275f86e2be2b1e53a4c27ebe Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.679536 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552244-zwxrg" event={"ID":"391543cc-519b-4e01-8886-04bde62c5298","Type":"ContainerDied","Data":"f33d45e49c487b98183faf1adc73353bffd3605e2e4034356d70d18beabca3f0"} Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.679615 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f33d45e49c487b98183faf1adc73353bffd3605e2e4034356d70d18beabca3f0" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.679553 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-zwxrg" Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.691277 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerStarted","Data":"b4032a0a620f17cd3f9ee7e418f4d03002b4d54e275f86e2be2b1e53a4c27ebe"} Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.718987 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-qbbs2"] Mar 10 09:24:05 crc kubenswrapper[4883]: I0310 09:24:05.728219 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-qbbs2"] Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.090723 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aae9177-76f0-4502-8f6a-19ad69a255ae" path="/var/lib/kubelet/pods/2aae9177-76f0-4502-8f6a-19ad69a255ae/volumes" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.092023 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38171111-f624-438d-ba5a-36f6b9cb29bf" path="/var/lib/kubelet/pods/38171111-f624-438d-ba5a-36f6b9cb29bf/volumes" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.251497 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pghj7"] Mar 10 09:24:06 crc kubenswrapper[4883]: E0310 09:24:06.252189 
4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391543cc-519b-4e01-8886-04bde62c5298" containerName="oc" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.252209 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="391543cc-519b-4e01-8886-04bde62c5298" containerName="oc" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.252402 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="391543cc-519b-4e01-8886-04bde62c5298" containerName="oc" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.254322 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.256874 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.257344 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.270229 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hmfjf" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.280035 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pghj7"] Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.354906 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.355119 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nmp8\" (UniqueName: 
\"kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.355149 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.355178 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.456674 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nmp8\" (UniqueName: \"kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.456720 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.456753 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.456830 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.462460 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.463281 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.470140 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.484875 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4nmp8\" (UniqueName: \"kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8\") pod \"nova-cell0-conductor-db-sync-pghj7\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.588782 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:06 crc kubenswrapper[4883]: I0310 09:24:06.703707 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerStarted","Data":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} Mar 10 09:24:07 crc kubenswrapper[4883]: I0310 09:24:07.001686 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pghj7"] Mar 10 09:24:07 crc kubenswrapper[4883]: W0310 09:24:07.013661 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46c8e962_9007_49e1_bd9f_d822e9100291.slice/crio-476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771 WatchSource:0}: Error finding container 476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771: Status 404 returned error can't find the container with id 476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771 Mar 10 09:24:07 crc kubenswrapper[4883]: I0310 09:24:07.714831 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pghj7" event={"ID":"46c8e962-9007-49e1-bd9f-d822e9100291","Type":"ContainerStarted","Data":"476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771"} Mar 10 09:24:07 crc kubenswrapper[4883]: I0310 09:24:07.716797 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerStarted","Data":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} Mar 10 09:24:07 crc kubenswrapper[4883]: I0310 09:24:07.716870 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerStarted","Data":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} Mar 10 09:24:09 crc kubenswrapper[4883]: I0310 09:24:09.809246 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 09:24:09 crc kubenswrapper[4883]: I0310 09:24:09.809551 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 09:24:09 crc kubenswrapper[4883]: I0310 09:24:09.851775 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 09:24:09 crc kubenswrapper[4883]: I0310 09:24:09.854885 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.760559 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerStarted","Data":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.761014 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.761036 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.761049 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.792680 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.686281202 podStartE2EDuration="6.792650822s" podCreationTimestamp="2026-03-10 09:24:04 +0000 UTC" firstStartedPulling="2026-03-10 09:24:05.517829641 +0000 UTC m=+1231.772727520" lastFinishedPulling="2026-03-10 09:24:09.624199252 +0000 UTC m=+1235.879097140" observedRunningTime="2026-03-10 09:24:10.779432216 +0000 UTC m=+1237.034330105" watchObservedRunningTime="2026-03-10 09:24:10.792650822 +0000 UTC m=+1237.047548701" Mar 10 09:24:10 crc kubenswrapper[4883]: I0310 09:24:10.912882 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.211810 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.212080 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.245625 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.247041 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.784246 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:11 crc kubenswrapper[4883]: I0310 09:24:11.784338 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:12 crc kubenswrapper[4883]: I0310 09:24:12.511047 4883 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 09:24:12 crc kubenswrapper[4883]: I0310 09:24:12.625334 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 09:24:13 crc kubenswrapper[4883]: I0310 09:24:13.793706 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:13 crc kubenswrapper[4883]: I0310 09:24:13.827376 4883 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 09:24:13 crc kubenswrapper[4883]: I0310 09:24:13.957885 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 09:24:14 crc kubenswrapper[4883]: I0310 09:24:14.917726 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:14 crc kubenswrapper[4883]: I0310 09:24:14.918027 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-central-agent" containerID="cri-o://0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" gracePeriod=30 Mar 10 09:24:14 crc kubenswrapper[4883]: I0310 09:24:14.918583 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="proxy-httpd" containerID="cri-o://fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" gracePeriod=30 Mar 10 09:24:14 crc kubenswrapper[4883]: I0310 09:24:14.918662 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="sg-core" containerID="cri-o://d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" gracePeriod=30 Mar 10 09:24:14 crc 
kubenswrapper[4883]: I0310 09:24:14.918710 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-notification-agent" containerID="cri-o://766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" gracePeriod=30 Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.556748 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.666705 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.667118 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.667268 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.668004 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.668175 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.668238 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxwp2\" (UniqueName: \"kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.668354 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd\") pod \"0f8485ae-b380-4555-8f4a-a71544094774\" (UID: \"0f8485ae-b380-4555-8f4a-a71544094774\") " Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.668963 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.669030 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.670725 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.670755 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f8485ae-b380-4555-8f4a-a71544094774-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.676337 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts" (OuterVolumeSpecName: "scripts") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.679186 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2" (OuterVolumeSpecName: "kube-api-access-sxwp2") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "kube-api-access-sxwp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.693155 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.744254 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.750535 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data" (OuterVolumeSpecName: "config-data") pod "0f8485ae-b380-4555-8f4a-a71544094774" (UID: "0f8485ae-b380-4555-8f4a-a71544094774"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.773033 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.773067 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.773077 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.773090 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxwp2\" (UniqueName: \"kubernetes.io/projected/0f8485ae-b380-4555-8f4a-a71544094774-kube-api-access-sxwp2\") on node \"crc\" DevicePath \"\"" Mar 10 
09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.773098 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8485ae-b380-4555-8f4a-a71544094774-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.847093 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pghj7" event={"ID":"46c8e962-9007-49e1-bd9f-d822e9100291","Type":"ContainerStarted","Data":"996ee60e7ca18498ccd88f148bbd61209a4f5cc8500909482eda8135e24f983e"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.849983 4883 generic.go:334] "Generic (PLEG): container finished" podID="0f8485ae-b380-4555-8f4a-a71544094774" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" exitCode=0 Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850010 4883 generic.go:334] "Generic (PLEG): container finished" podID="0f8485ae-b380-4555-8f4a-a71544094774" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" exitCode=2 Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850018 4883 generic.go:334] "Generic (PLEG): container finished" podID="0f8485ae-b380-4555-8f4a-a71544094774" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" exitCode=0 Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850026 4883 generic.go:334] "Generic (PLEG): container finished" podID="0f8485ae-b380-4555-8f4a-a71544094774" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" exitCode=0 Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850028 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerDied","Data":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850057 4883 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850103 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerDied","Data":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850117 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerDied","Data":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850126 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerDied","Data":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850136 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f8485ae-b380-4555-8f4a-a71544094774","Type":"ContainerDied","Data":"b4032a0a620f17cd3f9ee7e418f4d03002b4d54e275f86e2be2b1e53a4c27ebe"} Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.850152 4883 scope.go:117] "RemoveContainer" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.869914 4883 scope.go:117] "RemoveContainer" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.874689 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-pghj7" podStartSLOduration=1.588737966 podStartE2EDuration="9.874674135s" podCreationTimestamp="2026-03-10 09:24:06 +0000 UTC" firstStartedPulling="2026-03-10 09:24:07.015086464 +0000 UTC 
m=+1233.269984354" lastFinishedPulling="2026-03-10 09:24:15.301022634 +0000 UTC m=+1241.555920523" observedRunningTime="2026-03-10 09:24:15.867834437 +0000 UTC m=+1242.122732327" watchObservedRunningTime="2026-03-10 09:24:15.874674135 +0000 UTC m=+1242.129572024" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.887951 4883 scope.go:117] "RemoveContainer" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.897731 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.908111 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.912185 4883 scope.go:117] "RemoveContainer" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.917695 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.918145 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="sg-core" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918163 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="sg-core" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.918172 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-central-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918178 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-central-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.918195 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-notification-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918201 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-notification-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.918232 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="proxy-httpd" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918237 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="proxy-httpd" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918402 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="proxy-httpd" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918417 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="sg-core" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918425 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-central-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.918440 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8485ae-b380-4555-8f4a-a71544094774" containerName="ceilometer-notification-agent" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.925453 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.932207 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.932372 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.949577 4883 scope.go:117] "RemoveContainer" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.954031 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": container with ID starting with fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9 not found: ID does not exist" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.954081 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} err="failed to get container status \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": rpc error: code = NotFound desc = could not find container \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": container with ID starting with fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.954115 4883 scope.go:117] "RemoveContainer" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.958382 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": container with ID starting with d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578 not found: ID does not exist" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.958646 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} err="failed to get container status \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": rpc error: code = NotFound desc = could not find container \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": container with ID starting with d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.958716 4883 scope.go:117] "RemoveContainer" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.958394 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.959133 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": container with ID starting with 766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3 not found: ID does not exist" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.959190 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} err="failed to get container status \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": 
rpc error: code = NotFound desc = could not find container \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": container with ID starting with 766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.959222 4883 scope.go:117] "RemoveContainer" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: E0310 09:24:15.959782 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": container with ID starting with 0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279 not found: ID does not exist" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.959814 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} err="failed to get container status \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": rpc error: code = NotFound desc = could not find container \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": container with ID starting with 0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.959831 4883 scope.go:117] "RemoveContainer" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.960357 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} err="failed to get container status 
\"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": rpc error: code = NotFound desc = could not find container \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": container with ID starting with fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.960404 4883 scope.go:117] "RemoveContainer" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.960681 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} err="failed to get container status \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": rpc error: code = NotFound desc = could not find container \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": container with ID starting with d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.960715 4883 scope.go:117] "RemoveContainer" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.964418 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} err="failed to get container status \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": rpc error: code = NotFound desc = could not find container \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": container with ID starting with 766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.964448 4883 scope.go:117] "RemoveContainer" 
containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.964998 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} err="failed to get container status \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": rpc error: code = NotFound desc = could not find container \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": container with ID starting with 0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.965048 4883 scope.go:117] "RemoveContainer" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.965410 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} err="failed to get container status \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": rpc error: code = NotFound desc = could not find container \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": container with ID starting with fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.965442 4883 scope.go:117] "RemoveContainer" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.965726 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} err="failed to get container status \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": rpc error: code = NotFound desc = could 
not find container \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": container with ID starting with d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.965745 4883 scope.go:117] "RemoveContainer" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966012 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} err="failed to get container status \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": rpc error: code = NotFound desc = could not find container \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": container with ID starting with 766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966031 4883 scope.go:117] "RemoveContainer" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966240 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} err="failed to get container status \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": rpc error: code = NotFound desc = could not find container \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": container with ID starting with 0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966258 4883 scope.go:117] "RemoveContainer" containerID="fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 
09:24:15.966486 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9"} err="failed to get container status \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": rpc error: code = NotFound desc = could not find container \"fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9\": container with ID starting with fb018b75c62a3105b390ca82d41f7d5ced220f915f7e504b16b2cc66d9443da9 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966507 4883 scope.go:117] "RemoveContainer" containerID="d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966688 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578"} err="failed to get container status \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": rpc error: code = NotFound desc = could not find container \"d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578\": container with ID starting with d6775f0203476e6a515e8a7c4fa893480489df70dd4d57d83db00456da132578 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966707 4883 scope.go:117] "RemoveContainer" containerID="766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966876 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3"} err="failed to get container status \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": rpc error: code = NotFound desc = could not find container \"766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3\": container with ID starting with 
766c6c039837596b0a45d25097b4a2e6554bb918554b9d5d91dd3446107750e3 not found: ID does not exist" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.966895 4883 scope.go:117] "RemoveContainer" containerID="0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279" Mar 10 09:24:15 crc kubenswrapper[4883]: I0310 09:24:15.967057 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279"} err="failed to get container status \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": rpc error: code = NotFound desc = could not find container \"0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279\": container with ID starting with 0b64fa5dbed6ef8789c8cb5c0a14a18d9778631be4244e0545440f199fb4d279 not found: ID does not exist" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079281 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079329 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079369 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc 
kubenswrapper[4883]: I0310 09:24:16.079456 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079509 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079634 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.079734 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4hdp\" (UniqueName: \"kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.091515 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f8485ae-b380-4555-8f4a-a71544094774" path="/var/lib/kubelet/pods/0f8485ae-b380-4555-8f4a-a71544094774/volumes" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182380 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd\") pod \"ceilometer-0\" (UID: 
\"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182427 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182500 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182635 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182672 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182717 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.182761 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4hdp\" 
(UniqueName: \"kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.183789 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.184269 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.189372 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.189828 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.190376 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.190590 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.202049 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4hdp\" (UniqueName: \"kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp\") pod \"ceilometer-0\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.241789 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.524836 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.670990 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.678407 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:24:16 crc kubenswrapper[4883]: I0310 09:24:16.862309 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerStarted","Data":"bee0f47d7b61a9dac31a1270d716c8980786df32f0465319473f68681f0f03f9"} Mar 10 09:24:17 crc kubenswrapper[4883]: I0310 09:24:17.888423 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerStarted","Data":"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6"} Mar 10 09:24:18 crc kubenswrapper[4883]: I0310 09:24:18.898435 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerStarted","Data":"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0"} Mar 10 09:24:18 crc kubenswrapper[4883]: I0310 09:24:18.898909 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerStarted","Data":"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9"} Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.942549 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerStarted","Data":"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a"} Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.943155 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.942916 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="proxy-httpd" containerID="cri-o://8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a" gracePeriod=30 Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.942684 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-central-agent" containerID="cri-o://cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6" gracePeriod=30 Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.942928 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="sg-core" containerID="cri-o://c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0" gracePeriod=30 Mar 10 09:24:21 crc 
kubenswrapper[4883]: I0310 09:24:21.942940 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-notification-agent" containerID="cri-o://c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9" gracePeriod=30 Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.945596 4883 generic.go:334] "Generic (PLEG): container finished" podID="46c8e962-9007-49e1-bd9f-d822e9100291" containerID="996ee60e7ca18498ccd88f148bbd61209a4f5cc8500909482eda8135e24f983e" exitCode=0 Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.945652 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pghj7" event={"ID":"46c8e962-9007-49e1-bd9f-d822e9100291","Type":"ContainerDied","Data":"996ee60e7ca18498ccd88f148bbd61209a4f5cc8500909482eda8135e24f983e"} Mar 10 09:24:21 crc kubenswrapper[4883]: I0310 09:24:21.964369 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.437312297 podStartE2EDuration="6.964342091s" podCreationTimestamp="2026-03-10 09:24:15 +0000 UTC" firstStartedPulling="2026-03-10 09:24:16.677976729 +0000 UTC m=+1242.932874618" lastFinishedPulling="2026-03-10 09:24:21.205006523 +0000 UTC m=+1247.459904412" observedRunningTime="2026-03-10 09:24:21.957582765 +0000 UTC m=+1248.212480654" watchObservedRunningTime="2026-03-10 09:24:21.964342091 +0000 UTC m=+1248.219239980" Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.957104 4883 generic.go:334] "Generic (PLEG): container finished" podID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerID="8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a" exitCode=0 Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.957873 4883 generic.go:334] "Generic (PLEG): container finished" podID="71a316ea-2390-4458-aacb-c7b7b52030dd" 
containerID="c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0" exitCode=2 Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.957962 4883 generic.go:334] "Generic (PLEG): container finished" podID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerID="c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9" exitCode=0 Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.957179 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerDied","Data":"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a"} Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.958074 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerDied","Data":"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0"} Mar 10 09:24:22 crc kubenswrapper[4883]: I0310 09:24:22.958095 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerDied","Data":"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9"} Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.240670 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.436078 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data\") pod \"46c8e962-9007-49e1-bd9f-d822e9100291\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.436229 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nmp8\" (UniqueName: \"kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8\") pod \"46c8e962-9007-49e1-bd9f-d822e9100291\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.436384 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts\") pod \"46c8e962-9007-49e1-bd9f-d822e9100291\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.436450 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle\") pod \"46c8e962-9007-49e1-bd9f-d822e9100291\" (UID: \"46c8e962-9007-49e1-bd9f-d822e9100291\") " Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.442599 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts" (OuterVolumeSpecName: "scripts") pod "46c8e962-9007-49e1-bd9f-d822e9100291" (UID: "46c8e962-9007-49e1-bd9f-d822e9100291"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.443803 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8" (OuterVolumeSpecName: "kube-api-access-4nmp8") pod "46c8e962-9007-49e1-bd9f-d822e9100291" (UID: "46c8e962-9007-49e1-bd9f-d822e9100291"). InnerVolumeSpecName "kube-api-access-4nmp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.479557 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data" (OuterVolumeSpecName: "config-data") pod "46c8e962-9007-49e1-bd9f-d822e9100291" (UID: "46c8e962-9007-49e1-bd9f-d822e9100291"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.484664 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46c8e962-9007-49e1-bd9f-d822e9100291" (UID: "46c8e962-9007-49e1-bd9f-d822e9100291"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.540151 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.540257 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nmp8\" (UniqueName: \"kubernetes.io/projected/46c8e962-9007-49e1-bd9f-d822e9100291-kube-api-access-4nmp8\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.540358 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.540373 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c8e962-9007-49e1-bd9f-d822e9100291-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.969514 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pghj7" event={"ID":"46c8e962-9007-49e1-bd9f-d822e9100291","Type":"ContainerDied","Data":"476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771"} Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.970553 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="476b4874dc1fbb37d06665527cae7ca6640ff7b5cda30363d5e889fac818a771" Mar 10 09:24:23 crc kubenswrapper[4883]: I0310 09:24:23.969807 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pghj7" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.104834 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 09:24:24 crc kubenswrapper[4883]: E0310 09:24:24.105173 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c8e962-9007-49e1-bd9f-d822e9100291" containerName="nova-cell0-conductor-db-sync" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.105190 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c8e962-9007-49e1-bd9f-d822e9100291" containerName="nova-cell0-conductor-db-sync" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.105372 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c8e962-9007-49e1-bd9f-d822e9100291" containerName="nova-cell0-conductor-db-sync" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.106160 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.107804 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.109314 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hmfjf" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.121927 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.254251 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9wfk\" (UniqueName: \"kubernetes.io/projected/19096ebe-3796-4e22-a477-45d3e635a80a-kube-api-access-c9wfk\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:24 crc 
kubenswrapper[4883]: I0310 09:24:24.254352 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.254423 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.356729 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.356809 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.356922 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9wfk\" (UniqueName: \"kubernetes.io/projected/19096ebe-3796-4e22-a477-45d3e635a80a-kube-api-access-c9wfk\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.362034 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.362135 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19096ebe-3796-4e22-a477-45d3e635a80a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.372182 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9wfk\" (UniqueName: \"kubernetes.io/projected/19096ebe-3796-4e22-a477-45d3e635a80a-kube-api-access-c9wfk\") pod \"nova-cell0-conductor-0\" (UID: \"19096ebe-3796-4e22-a477-45d3e635a80a\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.404785 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.419959 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.559976 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4hdp\" (UniqueName: \"kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560090 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560251 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560302 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560558 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560604 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.560677 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle\") pod \"71a316ea-2390-4458-aacb-c7b7b52030dd\" (UID: \"71a316ea-2390-4458-aacb-c7b7b52030dd\") " Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.561159 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.561627 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.562333 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.562366 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71a316ea-2390-4458-aacb-c7b7b52030dd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.564669 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts" (OuterVolumeSpecName: "scripts") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.577874 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp" (OuterVolumeSpecName: "kube-api-access-c4hdp") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "kube-api-access-c4hdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.589698 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.629716 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.634910 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data" (OuterVolumeSpecName: "config-data") pod "71a316ea-2390-4458-aacb-c7b7b52030dd" (UID: "71a316ea-2390-4458-aacb-c7b7b52030dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.665030 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.665055 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4hdp\" (UniqueName: \"kubernetes.io/projected/71a316ea-2390-4458-aacb-c7b7b52030dd-kube-api-access-c4hdp\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.665068 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.665080 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-sg-core-conf-yaml\") on node \"crc\" DevicePath 
\"\"" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.665089 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a316ea-2390-4458-aacb-c7b7b52030dd-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:24 crc kubenswrapper[4883]: W0310 09:24:24.838023 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19096ebe_3796_4e22_a477_45d3e635a80a.slice/crio-75524614d6b68118e286c110d9aa73f78a9be830790220f146be33f8f77b6d0d WatchSource:0}: Error finding container 75524614d6b68118e286c110d9aa73f78a9be830790220f146be33f8f77b6d0d: Status 404 returned error can't find the container with id 75524614d6b68118e286c110d9aa73f78a9be830790220f146be33f8f77b6d0d Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.839035 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.983760 4883 generic.go:334] "Generic (PLEG): container finished" podID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerID="cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6" exitCode=0 Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.983864 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.983885 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerDied","Data":"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6"} Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.983960 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71a316ea-2390-4458-aacb-c7b7b52030dd","Type":"ContainerDied","Data":"bee0f47d7b61a9dac31a1270d716c8980786df32f0465319473f68681f0f03f9"} Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.983988 4883 scope.go:117] "RemoveContainer" containerID="8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a" Mar 10 09:24:24 crc kubenswrapper[4883]: I0310 09:24:24.986329 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"19096ebe-3796-4e22-a477-45d3e635a80a","Type":"ContainerStarted","Data":"75524614d6b68118e286c110d9aa73f78a9be830790220f146be33f8f77b6d0d"} Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.008088 4883 scope.go:117] "RemoveContainer" containerID="c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.019012 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.026402 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.033499 4883 scope.go:117] "RemoveContainer" containerID="c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.035792 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.036136 
4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="proxy-httpd" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036149 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="proxy-httpd" Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.036165 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="sg-core" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036171 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="sg-core" Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.036182 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-notification-agent" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036188 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-notification-agent" Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.036208 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-central-agent" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036213 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-central-agent" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036467 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="sg-core" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036498 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-notification-agent" Mar 10 09:24:25 crc kubenswrapper[4883]: 
I0310 09:24:25.036505 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="proxy-httpd" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.036518 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" containerName="ceilometer-central-agent" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.038392 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.039762 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.040821 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.049308 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.074630 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.074763 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.074842 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpz25\" (UniqueName: 
\"kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.075099 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.075131 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.075187 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.075221 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.077941 4883 scope.go:117] "RemoveContainer" containerID="cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.096076 4883 scope.go:117] "RemoveContainer" 
containerID="8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a" Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.096382 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a\": container with ID starting with 8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a not found: ID does not exist" containerID="8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.096412 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a"} err="failed to get container status \"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a\": rpc error: code = NotFound desc = could not find container \"8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a\": container with ID starting with 8ef040c00f0a48c4934a6c6c62dd47141bd4385bfc680f727a5e2eae78f81f3a not found: ID does not exist" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.096433 4883 scope.go:117] "RemoveContainer" containerID="c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0" Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.096748 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0\": container with ID starting with c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0 not found: ID does not exist" containerID="c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.096773 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0"} err="failed to get container status \"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0\": rpc error: code = NotFound desc = could not find container \"c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0\": container with ID starting with c81c2750915424048c2a8f9c5c1038f5d20f7d22572b567ffa4047098b57dff0 not found: ID does not exist" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.096788 4883 scope.go:117] "RemoveContainer" containerID="c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9" Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.097115 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9\": container with ID starting with c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9 not found: ID does not exist" containerID="c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.097136 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9"} err="failed to get container status \"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9\": rpc error: code = NotFound desc = could not find container \"c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9\": container with ID starting with c0841c106eb6fec3180d26f6dff59cc4d3e2f96efa730ac3adf4bbb03c90a1f9 not found: ID does not exist" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.097150 4883 scope.go:117] "RemoveContainer" containerID="cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6" Mar 10 09:24:25 crc kubenswrapper[4883]: E0310 09:24:25.097335 4883 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6\": container with ID starting with cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6 not found: ID does not exist" containerID="cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.097354 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6"} err="failed to get container status \"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6\": rpc error: code = NotFound desc = could not find container \"cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6\": container with ID starting with cbf44024ea261df16089cc9530c9332bc422fd370f28b6b027bda40a394fe3d6 not found: ID does not exist" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.177895 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.177958 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.177990 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpz25\" (UniqueName: \"kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" 
Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.178073 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.178097 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.178120 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.178147 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.178399 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.184335 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.184589 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.185237 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.185269 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.185335 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.193750 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpz25\" (UniqueName: \"kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25\") pod \"ceilometer-0\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.360651 4883 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.757497 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:24:25 crc kubenswrapper[4883]: W0310 09:24:25.771238 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13fc6b71_b633_4726_ad0d_91a04b592d3b.slice/crio-e06324548e387289ee5784fe269a4d67cc71d75aeed67c453f124447b142b7b8 WatchSource:0}: Error finding container e06324548e387289ee5784fe269a4d67cc71d75aeed67c453f124447b142b7b8: Status 404 returned error can't find the container with id e06324548e387289ee5784fe269a4d67cc71d75aeed67c453f124447b142b7b8 Mar 10 09:24:25 crc kubenswrapper[4883]: I0310 09:24:25.998785 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerStarted","Data":"e06324548e387289ee5784fe269a4d67cc71d75aeed67c453f124447b142b7b8"} Mar 10 09:24:26 crc kubenswrapper[4883]: I0310 09:24:26.002404 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"19096ebe-3796-4e22-a477-45d3e635a80a","Type":"ContainerStarted","Data":"c801de5a1fe88d8ceae751b1030db246ddaff2437d3b3776ac62681445fa6afb"} Mar 10 09:24:26 crc kubenswrapper[4883]: I0310 09:24:26.002636 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:26 crc kubenswrapper[4883]: I0310 09:24:26.091725 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a316ea-2390-4458-aacb-c7b7b52030dd" path="/var/lib/kubelet/pods/71a316ea-2390-4458-aacb-c7b7b52030dd/volumes" Mar 10 09:24:27 crc kubenswrapper[4883]: I0310 09:24:27.009983 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerStarted","Data":"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e"} Mar 10 09:24:28 crc kubenswrapper[4883]: I0310 09:24:28.023179 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerStarted","Data":"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96"} Mar 10 09:24:28 crc kubenswrapper[4883]: I0310 09:24:28.023570 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerStarted","Data":"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830"} Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.451824 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.471898 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=5.471882313 podStartE2EDuration="5.471882313s" podCreationTimestamp="2026-03-10 09:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:26.024246495 +0000 UTC m=+1252.279144405" watchObservedRunningTime="2026-03-10 09:24:29.471882313 +0000 UTC m=+1255.726780202" Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.889554 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-rptkb"] Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.891062 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.894684 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.896248 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 10 09:24:29 crc kubenswrapper[4883]: I0310 09:24:29.900429 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rptkb"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.068334 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.069752 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.071964 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.086563 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbkj2\" (UniqueName: \"kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.086597 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.086653 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.086684 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.106789 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.107776 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.110784 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.117541 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.150528 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189402 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbkj2\" (UniqueName: \"kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189440 
4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189511 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189543 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189581 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mslfz\" (UniqueName: \"kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189643 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.189669 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.195533 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.210906 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.226974 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.238683 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.240990 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbkj2\" (UniqueName: \"kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2\") pod \"nova-cell0-cell-mapping-rptkb\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc 
kubenswrapper[4883]: I0310 09:24:30.248449 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.254836 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.272932 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.292509 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.292633 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mslfz\" (UniqueName: \"kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.292700 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.292732 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc 
kubenswrapper[4883]: I0310 09:24:30.292751 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcsm2\" (UniqueName: \"kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.292768 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.299707 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.300908 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.324090 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mslfz\" (UniqueName: \"kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz\") pod \"nova-cell1-novncproxy-0\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.341520 4883 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.343130 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.347153 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.407044 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408268 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcsm2\" (UniqueName: \"kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408328 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408433 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408493 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " 
pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408530 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408612 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.408683 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg92q\" (UniqueName: \"kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.412011 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.427517 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.433548 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.449359 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcsm2\" (UniqueName: \"kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2\") pod \"nova-scheduler-0\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") " pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.502525 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.504217 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510366 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510435 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510503 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510539 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510579 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg92q\" (UniqueName: \"kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510605 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510703 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.510727 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kldfn\" (UniqueName: \"kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.511133 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " 
pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.519572 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.521434 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.535047 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.541994 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg92q\" (UniqueName: \"kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q\") pod \"nova-api-0\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.554737 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"] Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.595673 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617230 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617304 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kldfn\" (UniqueName: \"kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617378 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617424 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrr79\" (UniqueName: \"kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617508 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " 
pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617585 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617622 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617735 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.617778 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc 
kubenswrapper[4883]: I0310 09:24:30.620289 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.622852 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.623903 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.638342 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kldfn\" (UniqueName: \"kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn\") pod \"nova-metadata-0\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") " pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.664288 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.720069 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.720410 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.720518 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.720581 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.720693 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 
crc kubenswrapper[4883]: I0310 09:24:30.721591 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.722207 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.723007 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.723411 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.723959 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.724649 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " 
pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.724733 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrr79\" (UniqueName: \"kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.743809 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrr79\" (UniqueName: \"kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79\") pod \"dnsmasq-dns-7bd5679c8c-sqzrs\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:30 crc kubenswrapper[4883]: I0310 09:24:30.848553 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.025008 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.120122 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rptkb"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.136060 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerStarted","Data":"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb"} Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.136224 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.179716 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.214537578 podStartE2EDuration="6.179696798s" podCreationTimestamp="2026-03-10 09:24:25 +0000 UTC" firstStartedPulling="2026-03-10 09:24:25.775633696 +0000 UTC m=+1252.030531586" lastFinishedPulling="2026-03-10 09:24:29.740792917 +0000 UTC m=+1255.995690806" observedRunningTime="2026-03-10 09:24:31.162189826 +0000 UTC m=+1257.417087715" watchObservedRunningTime="2026-03-10 09:24:31.179696798 +0000 UTC m=+1257.434594688" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.196460 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s66f5"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.198848 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.200554 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.200977 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.219666 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s66f5"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.248794 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.248893 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tvcd\" (UniqueName: 
\"kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.248957 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.249102 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: W0310 09:24:31.258991 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47cb531f_d85b_41f0_9608_a19b158679c7.slice/crio-267bcf024c91a240e0cf8abd61f08e6de1d6f0aad3c2fdd15bea1caa97a544cf WatchSource:0}: Error finding container 267bcf024c91a240e0cf8abd61f08e6de1d6f0aad3c2fdd15bea1caa97a544cf: Status 404 returned error can't find the container with id 267bcf024c91a240e0cf8abd61f08e6de1d6f0aad3c2fdd15bea1caa97a544cf Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.259651 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.350873 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.351591 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tvcd\" (UniqueName: \"kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.351856 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.352021 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.356602 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.357417 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.363117 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.367531 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tvcd\" (UniqueName: \"kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd\") pod \"nova-cell1-conductor-db-sync-s66f5\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.489816 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.528829 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:31 crc kubenswrapper[4883]: W0310 09:24:31.610878 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e8e02c2_3c36_440a_b7aa_d39b27f3bd32.slice/crio-3df63ff82b84df0c9c734c9449801557a1a4dedf62a22952150b25c471cef32b WatchSource:0}: Error finding container 3df63ff82b84df0c9c734c9449801557a1a4dedf62a22952150b25c471cef32b: Status 404 returned error can't find the container with id 3df63ff82b84df0c9c734c9449801557a1a4dedf62a22952150b25c471cef32b Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.612952 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.628968 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"] Mar 10 09:24:31 crc kubenswrapper[4883]: W0310 09:24:31.642005 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabc326c8_0db0_4645_b1dc_3871b1b4202c.slice/crio-48534b3ef2da468c813fcefb6a12bb7f5e9a292a4eb15ad03c6e477d12cc26e8 WatchSource:0}: Error finding container 48534b3ef2da468c813fcefb6a12bb7f5e9a292a4eb15ad03c6e477d12cc26e8: Status 404 returned error can't find the container with id 48534b3ef2da468c813fcefb6a12bb7f5e9a292a4eb15ad03c6e477d12cc26e8 Mar 10 09:24:31 crc kubenswrapper[4883]: I0310 09:24:31.967975 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s66f5"] Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.149523 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32","Type":"ContainerStarted","Data":"3df63ff82b84df0c9c734c9449801557a1a4dedf62a22952150b25c471cef32b"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 
09:24:32.152295 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rptkb" event={"ID":"3d3a7934-1ab2-4013-b3ff-90859ffcc179","Type":"ContainerStarted","Data":"1b941b684a31273444bcbcfa6618c7c4de8354efb411f75cdb5b7cfd833281c3"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.152356 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rptkb" event={"ID":"3d3a7934-1ab2-4013-b3ff-90859ffcc179","Type":"ContainerStarted","Data":"93bf15898cd55775fe1c641481b3b6079857f825b2b5abf0ebd10208dc8b1155"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.159529 4883 generic.go:334] "Generic (PLEG): container finished" podID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerID="419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8" exitCode=0 Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.159642 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" event={"ID":"abc326c8-0db0-4645-b1dc-3871b1b4202c","Type":"ContainerDied","Data":"419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.159679 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" event={"ID":"abc326c8-0db0-4645-b1dc-3871b1b4202c","Type":"ContainerStarted","Data":"48534b3ef2da468c813fcefb6a12bb7f5e9a292a4eb15ad03c6e477d12cc26e8"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.166169 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s66f5" event={"ID":"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05","Type":"ContainerStarted","Data":"4666671669cee93f50494e5681010f34c95c13f1664290315b5bce6c7f0d081c"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.167308 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"b36d321f-f1b6-425e-abdf-61478d9ccf1a","Type":"ContainerStarted","Data":"b5a77ffb6de872f70305d30c9087d34b77d39ad804e7a809e3b0b8a5a62a2dd7"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.172267 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerStarted","Data":"784b466f36c208110a836e6f45ef305de71d86ec8154db38b7751f2d884445c5"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.177095 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerStarted","Data":"267bcf024c91a240e0cf8abd61f08e6de1d6f0aad3c2fdd15bea1caa97a544cf"} Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.178749 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-rptkb" podStartSLOduration=3.17873298 podStartE2EDuration="3.17873298s" podCreationTimestamp="2026-03-10 09:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:32.176931093 +0000 UTC m=+1258.431828981" watchObservedRunningTime="2026-03-10 09:24:32.17873298 +0000 UTC m=+1258.433630869" Mar 10 09:24:32 crc kubenswrapper[4883]: I0310 09:24:32.194419 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-s66f5" podStartSLOduration=1.1943984300000001 podStartE2EDuration="1.19439843s" podCreationTimestamp="2026-03-10 09:24:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:32.193892335 +0000 UTC m=+1258.448790225" watchObservedRunningTime="2026-03-10 09:24:32.19439843 +0000 UTC m=+1258.449296318" Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.192794 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" event={"ID":"abc326c8-0db0-4645-b1dc-3871b1b4202c","Type":"ContainerStarted","Data":"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867"} Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.193293 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.198095 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s66f5" event={"ID":"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05","Type":"ContainerStarted","Data":"9cbbbfbaa746758ec604b741e62dd7f69a1ec3688b2c70a13290cb0a5d5252e8"} Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.677097 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" podStartSLOduration=3.677078519 podStartE2EDuration="3.677078519s" podCreationTimestamp="2026-03-10 09:24:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:33.21520347 +0000 UTC m=+1259.470101359" watchObservedRunningTime="2026-03-10 09:24:33.677078519 +0000 UTC m=+1259.931976408" Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.679645 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 09:24:33 crc kubenswrapper[4883]: I0310 09:24:33.690650 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:34 crc kubenswrapper[4883]: I0310 09:24:34.210365 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32","Type":"ContainerStarted","Data":"fda8a7af1c52f69233e2ff5b9090007030a88af5530808dd1c1ae5e86ac05d25"} Mar 10 09:24:34 crc kubenswrapper[4883]: I0310 09:24:34.213801 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b36d321f-f1b6-425e-abdf-61478d9ccf1a","Type":"ContainerStarted","Data":"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"}
Mar 10 09:24:34 crc kubenswrapper[4883]: I0310 09:24:34.236106 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.504893738 podStartE2EDuration="4.236084343s" podCreationTimestamp="2026-03-10 09:24:30 +0000 UTC" firstStartedPulling="2026-03-10 09:24:31.614137275 +0000 UTC m=+1257.869035164" lastFinishedPulling="2026-03-10 09:24:33.34532788 +0000 UTC m=+1259.600225769" observedRunningTime="2026-03-10 09:24:34.223676216 +0000 UTC m=+1260.478574095" watchObservedRunningTime="2026-03-10 09:24:34.236084343 +0000 UTC m=+1260.490982232"
Mar 10 09:24:34 crc kubenswrapper[4883]: I0310 09:24:34.260383 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.35230112 podStartE2EDuration="4.260367824s" podCreationTimestamp="2026-03-10 09:24:30 +0000 UTC" firstStartedPulling="2026-03-10 09:24:31.087343182 +0000 UTC m=+1257.342241070" lastFinishedPulling="2026-03-10 09:24:32.995409884 +0000 UTC m=+1259.250307774" observedRunningTime="2026-03-10 09:24:34.240909562 +0000 UTC m=+1260.495807451" watchObservedRunningTime="2026-03-10 09:24:34.260367824 +0000 UTC m=+1260.515265712"
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.223044 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd" gracePeriod=30
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.408192 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.724341 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.843520 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.895912 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data\") pod \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") "
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.896018 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle\") pod \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") "
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.896170 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mslfz\" (UniqueName: \"kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz\") pod \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\" (UID: \"b36d321f-f1b6-425e-abdf-61478d9ccf1a\") "
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.899516 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz" (OuterVolumeSpecName: "kube-api-access-mslfz") pod "b36d321f-f1b6-425e-abdf-61478d9ccf1a" (UID: "b36d321f-f1b6-425e-abdf-61478d9ccf1a"). InnerVolumeSpecName "kube-api-access-mslfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.916438 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data" (OuterVolumeSpecName: "config-data") pod "b36d321f-f1b6-425e-abdf-61478d9ccf1a" (UID: "b36d321f-f1b6-425e-abdf-61478d9ccf1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.917454 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b36d321f-f1b6-425e-abdf-61478d9ccf1a" (UID: "b36d321f-f1b6-425e-abdf-61478d9ccf1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.998313 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.998352 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b36d321f-f1b6-425e-abdf-61478d9ccf1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:35 crc kubenswrapper[4883]: I0310 09:24:35.998368 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mslfz\" (UniqueName: \"kubernetes.io/projected/b36d321f-f1b6-425e-abdf-61478d9ccf1a-kube-api-access-mslfz\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.236057 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerStarted","Data":"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.238248 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerStarted","Data":"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.239663 4883 generic.go:334] "Generic (PLEG): container finished" podID="ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" containerID="9cbbbfbaa746758ec604b741e62dd7f69a1ec3688b2c70a13290cb0a5d5252e8" exitCode=0
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.239824 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s66f5" event={"ID":"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05","Type":"ContainerDied","Data":"9cbbbfbaa746758ec604b741e62dd7f69a1ec3688b2c70a13290cb0a5d5252e8"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.243692 4883 generic.go:334] "Generic (PLEG): container finished" podID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" containerID="27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd" exitCode=0
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.243807 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b36d321f-f1b6-425e-abdf-61478d9ccf1a","Type":"ContainerDied","Data":"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.243840 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b36d321f-f1b6-425e-abdf-61478d9ccf1a","Type":"ContainerDied","Data":"b5a77ffb6de872f70305d30c9087d34b77d39ad804e7a809e3b0b8a5a62a2dd7"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.243862 4883 scope.go:117] "RemoveContainer" containerID="27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.244136 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.248672 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-log" containerID="cri-o://5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457" gracePeriod=30
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.248987 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerStarted","Data":"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.251008 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerStarted","Data":"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"}
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.249184 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-metadata" containerID="cri-o://ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8" gracePeriod=30
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.259494 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.198673662 podStartE2EDuration="6.259459359s" podCreationTimestamp="2026-03-10 09:24:30 +0000 UTC" firstStartedPulling="2026-03-10 09:24:31.273172051 +0000 UTC m=+1257.528069940" lastFinishedPulling="2026-03-10 09:24:35.333957748 +0000 UTC m=+1261.588855637" observedRunningTime="2026-03-10 09:24:36.254461915 +0000 UTC m=+1262.509359804" watchObservedRunningTime="2026-03-10 09:24:36.259459359 +0000 UTC m=+1262.514357248"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.273241 4883 scope.go:117] "RemoveContainer" containerID="27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"
Mar 10 09:24:36 crc kubenswrapper[4883]: E0310 09:24:36.273673 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd\": container with ID starting with 27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd not found: ID does not exist" containerID="27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.273716 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd"} err="failed to get container status \"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd\": rpc error: code = NotFound desc = could not find container \"27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd\": container with ID starting with 27df3ec458d492727b88fd24e69c3f46a3b62107091616065d8957a9970215bd not found: ID does not exist"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.278245 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.289672 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.314774 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 09:24:36 crc kubenswrapper[4883]: E0310 09:24:36.315308 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.315330 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.315593 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.316329 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.317987 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.48274827 podStartE2EDuration="6.317968367s" podCreationTimestamp="2026-03-10 09:24:30 +0000 UTC" firstStartedPulling="2026-03-10 09:24:31.492320711 +0000 UTC m=+1257.747218600" lastFinishedPulling="2026-03-10 09:24:35.327540818 +0000 UTC m=+1261.582438697" observedRunningTime="2026-03-10 09:24:36.293899601 +0000 UTC m=+1262.548797490" watchObservedRunningTime="2026-03-10 09:24:36.317968367 +0000 UTC m=+1262.572866257"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.318748 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.318838 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.318764 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.325351 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.405907 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.405978 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br5kz\" (UniqueName: \"kubernetes.io/projected/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-kube-api-access-br5kz\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.406027 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.406193 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.406230 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.507288 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.507603 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br5kz\" (UniqueName: \"kubernetes.io/projected/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-kube-api-access-br5kz\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.507653 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.507706 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.507730 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.513373 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.518034 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.518709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.522065 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.524706 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br5kz\" (UniqueName: \"kubernetes.io/projected/2c5d710c-62fb-4a8c-8a5c-ec6709017c75-kube-api-access-br5kz\") pod \"nova-cell1-novncproxy-0\" (UID: \"2c5d710c-62fb-4a8c-8a5c-ec6709017c75\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.631944 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.734160 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.829557 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs\") pod \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") "
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.829611 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data\") pod \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") "
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.829694 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kldfn\" (UniqueName: \"kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn\") pod \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") "
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.829716 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle\") pod \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\" (UID: \"d18d7e6b-feee-4222-a8fd-c13c0c70db2a\") "
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.830152 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs" (OuterVolumeSpecName: "logs") pod "d18d7e6b-feee-4222-a8fd-c13c0c70db2a" (UID: "d18d7e6b-feee-4222-a8fd-c13c0c70db2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.830765 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-logs\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.833546 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn" (OuterVolumeSpecName: "kube-api-access-kldfn") pod "d18d7e6b-feee-4222-a8fd-c13c0c70db2a" (UID: "d18d7e6b-feee-4222-a8fd-c13c0c70db2a"). InnerVolumeSpecName "kube-api-access-kldfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.854159 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data" (OuterVolumeSpecName: "config-data") pod "d18d7e6b-feee-4222-a8fd-c13c0c70db2a" (UID: "d18d7e6b-feee-4222-a8fd-c13c0c70db2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.858253 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d18d7e6b-feee-4222-a8fd-c13c0c70db2a" (UID: "d18d7e6b-feee-4222-a8fd-c13c0c70db2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.933198 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.933232 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kldfn\" (UniqueName: \"kubernetes.io/projected/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-kube-api-access-kldfn\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:36 crc kubenswrapper[4883]: I0310 09:24:36.933246 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18d7e6b-feee-4222-a8fd-c13c0c70db2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.062967 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 09:24:37 crc kubenswrapper[4883]: W0310 09:24:37.063201 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c5d710c_62fb_4a8c_8a5c_ec6709017c75.slice/crio-b5748be1a30c2c0759c323ba2e951830ad159416d7d5aa62229441f92ae75abe WatchSource:0}: Error finding container b5748be1a30c2c0759c323ba2e951830ad159416d7d5aa62229441f92ae75abe: Status 404 returned error can't find the container with id b5748be1a30c2c0759c323ba2e951830ad159416d7d5aa62229441f92ae75abe
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258530 4883 generic.go:334] "Generic (PLEG): container finished" podID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerID="ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8" exitCode=0
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258749 4883 generic.go:334] "Generic (PLEG): container finished" podID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerID="5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457" exitCode=143
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258786 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerDied","Data":"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"}
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258810 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerDied","Data":"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"}
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258820 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d18d7e6b-feee-4222-a8fd-c13c0c70db2a","Type":"ContainerDied","Data":"784b466f36c208110a836e6f45ef305de71d86ec8154db38b7751f2d884445c5"}
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258834 4883 scope.go:117] "RemoveContainer" containerID="ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.258915 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.264042 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2c5d710c-62fb-4a8c-8a5c-ec6709017c75","Type":"ContainerStarted","Data":"febf65f20b57000a3d173a143e3b6c0d43fb2e01df47333228b40fc340e619a9"}
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.264068 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2c5d710c-62fb-4a8c-8a5c-ec6709017c75","Type":"ContainerStarted","Data":"b5748be1a30c2c0759c323ba2e951830ad159416d7d5aa62229441f92ae75abe"}
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.286466 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.2864539910000001 podStartE2EDuration="1.286453991s" podCreationTimestamp="2026-03-10 09:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:37.28084306 +0000 UTC m=+1263.535740949" watchObservedRunningTime="2026-03-10 09:24:37.286453991 +0000 UTC m=+1263.541351879"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.306613 4883 scope.go:117] "RemoveContainer" containerID="5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.310549 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.330811 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.348604 4883 scope.go:117] "RemoveContainer" containerID="ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"
Mar 10 09:24:37 crc kubenswrapper[4883]: E0310 09:24:37.353112 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8\": container with ID starting with ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8 not found: ID does not exist" containerID="ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353151 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"} err="failed to get container status \"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8\": rpc error: code = NotFound desc = could not find container \"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8\": container with ID starting with ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8 not found: ID does not exist"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353182 4883 scope.go:117] "RemoveContainer" containerID="5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353245 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:37 crc kubenswrapper[4883]: E0310 09:24:37.353731 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-log"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353750 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-log"
Mar 10 09:24:37 crc kubenswrapper[4883]: E0310 09:24:37.353762 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-metadata"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353771 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-metadata"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353969 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-metadata"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.353996 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" containerName="nova-metadata-log"
Mar 10 09:24:37 crc kubenswrapper[4883]: E0310 09:24:37.354776 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457\": container with ID starting with 5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457 not found: ID does not exist" containerID="5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.354826 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"} err="failed to get container status \"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457\": rpc error: code = NotFound desc = could not find container \"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457\": container with ID starting with 5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457 not found: ID does not exist"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.354871 4883 scope.go:117] "RemoveContainer" containerID="ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.355060 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.355175 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8"} err="failed to get container status \"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8\": rpc error: code = NotFound desc = could not find container \"ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8\": container with ID starting with ff5a74a75174a1a33486cec12d7bfa88eb22cd9d42f30b323b066b19b191f6c8 not found: ID does not exist"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.355218 4883 scope.go:117] "RemoveContainer" containerID="5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.355878 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457"} err="failed to get container status \"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457\": rpc error: code = NotFound desc = could not find container \"5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457\": container with ID starting with 5c3ffead360f542dbc3b857abd109891354fece821df5bc60404f67483001457 not found: ID does not exist"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.358686 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.358936 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.363354 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.443877 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.443940 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.444012 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.444092 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.444125 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9f88\" (UniqueName: \"kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.546170 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.546492 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.546564 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.546607 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.546654 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9f88\" (UniqueName: \"kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.547158 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") "
pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.552706 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.557912 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.560412 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.564497 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9f88\" (UniqueName: \"kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88\") pod \"nova-metadata-0\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.631915 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.768985 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tvcd\" (UniqueName: \"kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd\") pod \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.769391 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle\") pod \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.769435 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts\") pod \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.769703 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data\") pod \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\" (UID: \"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05\") " Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.774555 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd" (OuterVolumeSpecName: "kube-api-access-5tvcd") pod "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" (UID: "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05"). InnerVolumeSpecName "kube-api-access-5tvcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.781220 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.785102 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts" (OuterVolumeSpecName: "scripts") pod "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" (UID: "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.790784 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" (UID: "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.792824 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data" (OuterVolumeSpecName: "config-data") pod "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" (UID: "ef281d66-a4a1-4a1a-b9d7-d6265b46ca05"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.873867 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tvcd\" (UniqueName: \"kubernetes.io/projected/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-kube-api-access-5tvcd\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.873904 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.873915 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:37 crc kubenswrapper[4883]: I0310 09:24:37.873925 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.091205 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b36d321f-f1b6-425e-abdf-61478d9ccf1a" path="/var/lib/kubelet/pods/b36d321f-f1b6-425e-abdf-61478d9ccf1a/volumes" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.092027 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18d7e6b-feee-4222-a8fd-c13c0c70db2a" path="/var/lib/kubelet/pods/d18d7e6b-feee-4222-a8fd-c13c0c70db2a/volumes" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.195504 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:38 crc kubenswrapper[4883]: W0310 09:24:38.201286 4883 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2300d23_b25a_4e0d_a695_7c11709bfcda.slice/crio-ffd12d2f9453cb1bc4150b06edfaa9bcf485473227f096bcbefd8dc8faea0eca WatchSource:0}: Error finding container ffd12d2f9453cb1bc4150b06edfaa9bcf485473227f096bcbefd8dc8faea0eca: Status 404 returned error can't find the container with id ffd12d2f9453cb1bc4150b06edfaa9bcf485473227f096bcbefd8dc8faea0eca Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.276139 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerStarted","Data":"ffd12d2f9453cb1bc4150b06edfaa9bcf485473227f096bcbefd8dc8faea0eca"} Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.278102 4883 generic.go:334] "Generic (PLEG): container finished" podID="3d3a7934-1ab2-4013-b3ff-90859ffcc179" containerID="1b941b684a31273444bcbcfa6618c7c4de8354efb411f75cdb5b7cfd833281c3" exitCode=0 Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.278158 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rptkb" event={"ID":"3d3a7934-1ab2-4013-b3ff-90859ffcc179","Type":"ContainerDied","Data":"1b941b684a31273444bcbcfa6618c7c4de8354efb411f75cdb5b7cfd833281c3"} Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.280150 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s66f5" event={"ID":"ef281d66-a4a1-4a1a-b9d7-d6265b46ca05","Type":"ContainerDied","Data":"4666671669cee93f50494e5681010f34c95c13f1664290315b5bce6c7f0d081c"} Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.280210 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4666671669cee93f50494e5681010f34c95c13f1664290315b5bce6c7f0d081c" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.280309 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s66f5" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.358097 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:24:38 crc kubenswrapper[4883]: E0310 09:24:38.358814 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" containerName="nova-cell1-conductor-db-sync" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.358836 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" containerName="nova-cell1-conductor-db-sync" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.359023 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" containerName="nova-cell1-conductor-db-sync" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.359732 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.361464 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.384646 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.493231 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.493491 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.493650 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8n9t\" (UniqueName: \"kubernetes.io/projected/90b06d82-9f07-4c29-9bad-987d2c6d027c-kube-api-access-t8n9t\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.595219 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8n9t\" (UniqueName: \"kubernetes.io/projected/90b06d82-9f07-4c29-9bad-987d2c6d027c-kube-api-access-t8n9t\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.595446 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.595576 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.600634 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.601076 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b06d82-9f07-4c29-9bad-987d2c6d027c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.609763 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8n9t\" (UniqueName: \"kubernetes.io/projected/90b06d82-9f07-4c29-9bad-987d2c6d027c-kube-api-access-t8n9t\") pod \"nova-cell1-conductor-0\" (UID: \"90b06d82-9f07-4c29-9bad-987d2c6d027c\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:38 crc kubenswrapper[4883]: I0310 09:24:38.779888 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.185192 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:24:39 crc kubenswrapper[4883]: W0310 09:24:39.187848 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90b06d82_9f07_4c29_9bad_987d2c6d027c.slice/crio-eaf7cf2d83f4edaa00f3dd015a77894e6e02ae1414bba017094d3564d5386f66 WatchSource:0}: Error finding container eaf7cf2d83f4edaa00f3dd015a77894e6e02ae1414bba017094d3564d5386f66: Status 404 returned error can't find the container with id eaf7cf2d83f4edaa00f3dd015a77894e6e02ae1414bba017094d3564d5386f66 Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.293999 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"90b06d82-9f07-4c29-9bad-987d2c6d027c","Type":"ContainerStarted","Data":"eaf7cf2d83f4edaa00f3dd015a77894e6e02ae1414bba017094d3564d5386f66"} Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.296802 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerStarted","Data":"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"} Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.296867 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerStarted","Data":"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"} Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.327662 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.327647916 podStartE2EDuration="2.327647916s" podCreationTimestamp="2026-03-10 09:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:39.317387498 +0000 UTC m=+1265.572285387" watchObservedRunningTime="2026-03-10 09:24:39.327647916 +0000 UTC m=+1265.582545806" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.579559 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.726578 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbkj2\" (UniqueName: \"kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2\") pod \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.726632 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts\") pod \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.727326 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data\") pod \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.727387 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle\") pod \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\" (UID: \"3d3a7934-1ab2-4013-b3ff-90859ffcc179\") " Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.733283 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2" (OuterVolumeSpecName: "kube-api-access-hbkj2") pod "3d3a7934-1ab2-4013-b3ff-90859ffcc179" (UID: "3d3a7934-1ab2-4013-b3ff-90859ffcc179"). InnerVolumeSpecName "kube-api-access-hbkj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.733649 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts" (OuterVolumeSpecName: "scripts") pod "3d3a7934-1ab2-4013-b3ff-90859ffcc179" (UID: "3d3a7934-1ab2-4013-b3ff-90859ffcc179"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.753236 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d3a7934-1ab2-4013-b3ff-90859ffcc179" (UID: "3d3a7934-1ab2-4013-b3ff-90859ffcc179"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.753670 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data" (OuterVolumeSpecName: "config-data") pod "3d3a7934-1ab2-4013-b3ff-90859ffcc179" (UID: "3d3a7934-1ab2-4013-b3ff-90859ffcc179"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.829312 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbkj2\" (UniqueName: \"kubernetes.io/projected/3d3a7934-1ab2-4013-b3ff-90859ffcc179-kube-api-access-hbkj2\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.829335 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.829345 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:39 crc kubenswrapper[4883]: I0310 09:24:39.829389 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a7934-1ab2-4013-b3ff-90859ffcc179-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.313557 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rptkb" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.313528 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rptkb" event={"ID":"3d3a7934-1ab2-4013-b3ff-90859ffcc179","Type":"ContainerDied","Data":"93bf15898cd55775fe1c641481b3b6079857f825b2b5abf0ebd10208dc8b1155"} Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.314103 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93bf15898cd55775fe1c641481b3b6079857f825b2b5abf0ebd10208dc8b1155" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.318991 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"90b06d82-9f07-4c29-9bad-987d2c6d027c","Type":"ContainerStarted","Data":"1c550759d0b078fae7cdd0a6a35ebfbeea86ecb02fac08bf2b4feb61ea95f4b9"} Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.320168 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.335426 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.33540928 podStartE2EDuration="2.33540928s" podCreationTimestamp="2026-03-10 09:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:40.332605022 +0000 UTC m=+1266.587502911" watchObservedRunningTime="2026-03-10 09:24:40.33540928 +0000 UTC m=+1266.590307169" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.478983 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.479290 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" 
containerName="nova-api-log" containerID="cri-o://fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e" gracePeriod=30 Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.479998 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-api" containerID="cri-o://cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2" gracePeriod=30 Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.495230 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.495653 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" containerName="nova-scheduler-scheduler" containerID="cri-o://fda8a7af1c52f69233e2ff5b9090007030a88af5530808dd1c1ae5e86ac05d25" gracePeriod=30 Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.520806 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.859615 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.981011 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"] Mar 10 09:24:40 crc kubenswrapper[4883]: I0310 09:24:40.990807 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="dnsmasq-dns" containerID="cri-o://cc94e4a4d546d543f44873ce6a1ec9593535c80494c566268aaa2c5b62a424b1" gracePeriod=10 Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.182611 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333494 4883 generic.go:334] "Generic (PLEG): container finished" podID="47cb531f-d85b-41f0-9608-a19b158679c7" containerID="cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2" exitCode=0 Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333553 4883 generic.go:334] "Generic (PLEG): container finished" podID="47cb531f-d85b-41f0-9608-a19b158679c7" containerID="fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e" exitCode=143 Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333613 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerDied","Data":"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"} Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333667 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerDied","Data":"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"} Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333679 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47cb531f-d85b-41f0-9608-a19b158679c7","Type":"ContainerDied","Data":"267bcf024c91a240e0cf8abd61f08e6de1d6f0aad3c2fdd15bea1caa97a544cf"} Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333703 4883 scope.go:117] "RemoveContainer" containerID="cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.333911 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.337340 4883 generic.go:334] "Generic (PLEG): container finished" podID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerID="cc94e4a4d546d543f44873ce6a1ec9593535c80494c566268aaa2c5b62a424b1" exitCode=0 Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.337624 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" event={"ID":"fb652436-cf46-4a91-b358-f6c6a011cf43","Type":"ContainerDied","Data":"cc94e4a4d546d543f44873ce6a1ec9593535c80494c566268aaa2c5b62a424b1"} Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.337913 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-log" containerID="cri-o://7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff" gracePeriod=30 Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.338104 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-metadata" containerID="cri-o://e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af" gracePeriod=30 Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.370187 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg92q\" (UniqueName: \"kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q\") pod \"47cb531f-d85b-41f0-9608-a19b158679c7\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.370378 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data\") pod \"47cb531f-d85b-41f0-9608-a19b158679c7\" (UID: 
\"47cb531f-d85b-41f0-9608-a19b158679c7\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.370490 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs\") pod \"47cb531f-d85b-41f0-9608-a19b158679c7\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.370576 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle\") pod \"47cb531f-d85b-41f0-9608-a19b158679c7\" (UID: \"47cb531f-d85b-41f0-9608-a19b158679c7\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.371574 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs" (OuterVolumeSpecName: "logs") pod "47cb531f-d85b-41f0-9608-a19b158679c7" (UID: "47cb531f-d85b-41f0-9608-a19b158679c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.378563 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q" (OuterVolumeSpecName: "kube-api-access-dg92q") pod "47cb531f-d85b-41f0-9608-a19b158679c7" (UID: "47cb531f-d85b-41f0-9608-a19b158679c7"). InnerVolumeSpecName "kube-api-access-dg92q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.384772 4883 scope.go:117] "RemoveContainer" containerID="fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.397733 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data" (OuterVolumeSpecName: "config-data") pod "47cb531f-d85b-41f0-9608-a19b158679c7" (UID: "47cb531f-d85b-41f0-9608-a19b158679c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.400984 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47cb531f-d85b-41f0-9608-a19b158679c7" (UID: "47cb531f-d85b-41f0-9608-a19b158679c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.416713 4883 scope.go:117] "RemoveContainer" containerID="cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2" Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.417293 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2\": container with ID starting with cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2 not found: ID does not exist" containerID="cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.417336 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"} err="failed to get container status \"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2\": rpc error: code = NotFound desc = could not find container \"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2\": container with ID starting with cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2 not found: ID does not exist" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.417364 4883 scope.go:117] "RemoveContainer" containerID="fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e" Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.420839 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e\": container with ID starting with fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e not found: ID does not exist" containerID="fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.420885 
4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"} err="failed to get container status \"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e\": rpc error: code = NotFound desc = could not find container \"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e\": container with ID starting with fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e not found: ID does not exist" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.420915 4883 scope.go:117] "RemoveContainer" containerID="cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.421230 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2"} err="failed to get container status \"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2\": rpc error: code = NotFound desc = could not find container \"cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2\": container with ID starting with cd2a05be80a4a7439817da41d7352f0e75fc9bd55a2802efeb13059b3c0a0dd2 not found: ID does not exist" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.421247 4883 scope.go:117] "RemoveContainer" containerID="fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.421509 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e"} err="failed to get container status \"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e\": rpc error: code = NotFound desc = could not find container \"fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e\": container with ID starting with 
fc25bb2cd75ddf8f9908e275db6e3d03277105ec71df7670bb83b8544a3ce27e not found: ID does not exist" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.440096 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.476791 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.476889 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47cb531f-d85b-41f0-9608-a19b158679c7-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.477010 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cb531f-d85b-41f0-9608-a19b158679c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.477023 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg92q\" (UniqueName: \"kubernetes.io/projected/47cb531f-d85b-41f0-9608-a19b158679c7-kube-api-access-dg92q\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.579325 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb\") pod \"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.579799 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0\") pod 
\"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.579850 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config\") pod \"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.580053 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h2jc\" (UniqueName: \"kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc\") pod \"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.580126 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb\") pod \"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.580228 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc\") pod \"fb652436-cf46-4a91-b358-f6c6a011cf43\" (UID: \"fb652436-cf46-4a91-b358-f6c6a011cf43\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.584253 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc" (OuterVolumeSpecName: "kube-api-access-9h2jc") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "kube-api-access-9h2jc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.621355 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.628649 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.632192 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.633738 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.637637 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config" (OuterVolumeSpecName: "config") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.645462 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb652436-cf46-4a91-b358-f6c6a011cf43" (UID: "fb652436-cf46-4a91-b358-f6c6a011cf43"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684465 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684508 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684521 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h2jc\" (UniqueName: \"kubernetes.io/projected/fb652436-cf46-4a91-b358-f6c6a011cf43-kube-api-access-9h2jc\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684532 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684544 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.684552 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fb652436-cf46-4a91-b358-f6c6a011cf43-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.723564 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.728104 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.752833 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.753292 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-log" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753308 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-log" Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.753318 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3a7934-1ab2-4013-b3ff-90859ffcc179" containerName="nova-manage" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753325 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3a7934-1ab2-4013-b3ff-90859ffcc179" containerName="nova-manage" Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.753338 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="dnsmasq-dns" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753343 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="dnsmasq-dns" Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.753365 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="init" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753373 4883 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="init" Mar 10 09:24:41 crc kubenswrapper[4883]: E0310 09:24:41.753389 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-api" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753395 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-api" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753609 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3a7934-1ab2-4013-b3ff-90859ffcc179" containerName="nova-manage" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753635 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-api" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753643 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" containerName="dnsmasq-dns" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.753663 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" containerName="nova-api-log" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.754650 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.759907 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.761102 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.878506 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.888392 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.888714 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.889042 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfvw5\" (UniqueName: \"kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.889742 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.991754 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data\") pod \"c2300d23-b25a-4e0d-a695-7c11709bfcda\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.991808 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs\") pod \"c2300d23-b25a-4e0d-a695-7c11709bfcda\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.991835 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs\") pod \"c2300d23-b25a-4e0d-a695-7c11709bfcda\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.991937 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle\") pod \"c2300d23-b25a-4e0d-a695-7c11709bfcda\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992053 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9f88\" (UniqueName: \"kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88\") pod \"c2300d23-b25a-4e0d-a695-7c11709bfcda\" (UID: \"c2300d23-b25a-4e0d-a695-7c11709bfcda\") " Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992263 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs" (OuterVolumeSpecName: "logs") pod "c2300d23-b25a-4e0d-a695-7c11709bfcda" (UID: "c2300d23-b25a-4e0d-a695-7c11709bfcda"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992511 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992574 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992661 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992734 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfvw5\" (UniqueName: \"kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.992798 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2300d23-b25a-4e0d-a695-7c11709bfcda-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.993391 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs\") pod \"nova-api-0\" (UID: 
\"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.997882 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:41 crc kubenswrapper[4883]: I0310 09:24:41.998848 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.002548 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88" (OuterVolumeSpecName: "kube-api-access-f9f88") pod "c2300d23-b25a-4e0d-a695-7c11709bfcda" (UID: "c2300d23-b25a-4e0d-a695-7c11709bfcda"). InnerVolumeSpecName "kube-api-access-f9f88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.010908 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfvw5\" (UniqueName: \"kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5\") pod \"nova-api-0\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " pod="openstack/nova-api-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.020163 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2300d23-b25a-4e0d-a695-7c11709bfcda" (UID: "c2300d23-b25a-4e0d-a695-7c11709bfcda"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.024431 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data" (OuterVolumeSpecName: "config-data") pod "c2300d23-b25a-4e0d-a695-7c11709bfcda" (UID: "c2300d23-b25a-4e0d-a695-7c11709bfcda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.050646 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c2300d23-b25a-4e0d-a695-7c11709bfcda" (UID: "c2300d23-b25a-4e0d-a695-7c11709bfcda"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.077712 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.095283 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.095316 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9f88\" (UniqueName: \"kubernetes.io/projected/c2300d23-b25a-4e0d-a695-7c11709bfcda-kube-api-access-f9f88\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.095330 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.095341 4883 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2300d23-b25a-4e0d-a695-7c11709bfcda-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.109754 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47cb531f-d85b-41f0-9608-a19b158679c7" path="/var/lib/kubelet/pods/47cb531f-d85b-41f0-9608-a19b158679c7/volumes" Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.352786 4883 generic.go:334] "Generic (PLEG): container finished" podID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerID="e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af" exitCode=0 Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.353071 4883 generic.go:334] "Generic (PLEG): container finished" podID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerID="7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff" exitCode=143 Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.352877 4883 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.352905 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerDied","Data":"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"}
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.353224 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerDied","Data":"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"}
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.353259 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2300d23-b25a-4e0d-a695-7c11709bfcda","Type":"ContainerDied","Data":"ffd12d2f9453cb1bc4150b06edfaa9bcf485473227f096bcbefd8dc8faea0eca"}
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.353279 4883 scope.go:117] "RemoveContainer" containerID="e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.361538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8" event={"ID":"fb652436-cf46-4a91-b358-f6c6a011cf43","Type":"ContainerDied","Data":"f3dfd9c8abe53e2f4e70fd004e6457ef025d9a2c819617d0dfac05e54db79843"}
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.361720 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.382447 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.391794 4883 scope.go:117] "RemoveContainer" containerID="7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.419374 4883 scope.go:117] "RemoveContainer" containerID="e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"
Mar 10 09:24:42 crc kubenswrapper[4883]: E0310 09:24:42.420467 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af\": container with ID starting with e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af not found: ID does not exist" containerID="e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.420556 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"} err="failed to get container status \"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af\": rpc error: code = NotFound desc = could not find container \"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af\": container with ID starting with e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af not found: ID does not exist"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.420591 4883 scope.go:117] "RemoveContainer" containerID="7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"
Mar 10 09:24:42 crc kubenswrapper[4883]: E0310 09:24:42.421824 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff\": container with ID starting with 7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff not found: ID does not exist" containerID="7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.421851 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"} err="failed to get container status \"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff\": rpc error: code = NotFound desc = could not find container \"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff\": container with ID starting with 7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff not found: ID does not exist"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.421868 4883 scope.go:117] "RemoveContainer" containerID="e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.422168 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af"} err="failed to get container status \"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af\": rpc error: code = NotFound desc = could not find container \"e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af\": container with ID starting with e476b45e010955cd77ec678340f606dae4cf18c89d95b78d8ea14c5d1eacd4af not found: ID does not exist"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.422184 4883 scope.go:117] "RemoveContainer" containerID="7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.422437 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff"} err="failed to get container status \"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff\": rpc error: code = NotFound desc = could not find container \"7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff\": container with ID starting with 7e7caf9a1a8441e8398c0545cdb8236d791459eab880423c09d0e1b820cba2ff not found: ID does not exist"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.422469 4883 scope.go:117] "RemoveContainer" containerID="cc94e4a4d546d543f44873ce6a1ec9593535c80494c566268aaa2c5b62a424b1"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.427280 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.445164 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:42 crc kubenswrapper[4883]: E0310 09:24:42.445872 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-log"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.445891 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-log"
Mar 10 09:24:42 crc kubenswrapper[4883]: E0310 09:24:42.446127 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-metadata"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.446139 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-metadata"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.446460 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-metadata"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.446536 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" containerName="nova-metadata-log"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.448529 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.450836 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.454053 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.455609 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"]
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.455977 4883 scope.go:117] "RemoveContainer" containerID="8dc17cfa98b4f413422ff6ec7b4debd0ca6ed29db8f51bb73e604fc0c8aedd72"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.466166 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-mnnl8"]
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.471923 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.487572 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 09:24:42 crc kubenswrapper[4883]: W0310 09:24:42.489713 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf34d69a2_fd0d_42e4_942f_178dbf2c1b55.slice/crio-b39a955be44f48c7e8bcca14deb268540526603ce0e5d25ae8ee8ee9e00bbf62 WatchSource:0}: Error finding container b39a955be44f48c7e8bcca14deb268540526603ce0e5d25ae8ee8ee9e00bbf62: Status 404 returned error can't find the container with id b39a955be44f48c7e8bcca14deb268540526603ce0e5d25ae8ee8ee9e00bbf62
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.609237 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.609495 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.609535 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.609554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6k6s\" (UniqueName: \"kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.609578 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.712198 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.712261 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.712300 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.712323 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6k6s\" (UniqueName: \"kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.712353 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.713394 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.716612 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.717389 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.717828 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.729005 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6k6s\" (UniqueName: \"kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s\") pod \"nova-metadata-0\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " pod="openstack/nova-metadata-0"
Mar 10 09:24:42 crc kubenswrapper[4883]: I0310 09:24:42.766414 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 09:24:43 crc kubenswrapper[4883]: I0310 09:24:43.378209 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerStarted","Data":"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8"}
Mar 10 09:24:43 crc kubenswrapper[4883]: I0310 09:24:43.378501 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerStarted","Data":"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a"}
Mar 10 09:24:43 crc kubenswrapper[4883]: I0310 09:24:43.378514 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerStarted","Data":"b39a955be44f48c7e8bcca14deb268540526603ce0e5d25ae8ee8ee9e00bbf62"}
Mar 10 09:24:43 crc kubenswrapper[4883]: I0310 09:24:43.407297 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.407278218 podStartE2EDuration="2.407278218s" podCreationTimestamp="2026-03-10 09:24:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:43.394250541 +0000 UTC m=+1269.649148420" watchObservedRunningTime="2026-03-10 09:24:43.407278218 +0000 UTC m=+1269.662176106"
Mar 10 09:24:43 crc kubenswrapper[4883]: I0310 09:24:43.791864 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 09:24:43 crc kubenswrapper[4883]: W0310 09:24:43.796581 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod398e71db_8c97_477b_b92c_35829f9b7dee.slice/crio-f719cd0714eff1ee6766787e51cf9b5c14efc5a4992031e0aeeeb87d03232888 WatchSource:0}: Error finding container f719cd0714eff1ee6766787e51cf9b5c14efc5a4992031e0aeeeb87d03232888: Status 404 returned error can't find the container with id f719cd0714eff1ee6766787e51cf9b5c14efc5a4992031e0aeeeb87d03232888
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.104926 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2300d23-b25a-4e0d-a695-7c11709bfcda" path="/var/lib/kubelet/pods/c2300d23-b25a-4e0d-a695-7c11709bfcda/volumes"
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.114245 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb652436-cf46-4a91-b358-f6c6a011cf43" path="/var/lib/kubelet/pods/fb652436-cf46-4a91-b358-f6c6a011cf43/volumes"
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.391542 4883 generic.go:334] "Generic (PLEG): container finished" podID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" containerID="fda8a7af1c52f69233e2ff5b9090007030a88af5530808dd1c1ae5e86ac05d25" exitCode=0
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.391640 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32","Type":"ContainerDied","Data":"fda8a7af1c52f69233e2ff5b9090007030a88af5530808dd1c1ae5e86ac05d25"}
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.399719 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerStarted","Data":"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f"}
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.399768 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerStarted","Data":"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98"}
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.399784 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerStarted","Data":"f719cd0714eff1ee6766787e51cf9b5c14efc5a4992031e0aeeeb87d03232888"}
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.428277 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.428255633 podStartE2EDuration="2.428255633s" podCreationTimestamp="2026-03-10 09:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:44.42183182 +0000 UTC m=+1270.676729729" watchObservedRunningTime="2026-03-10 09:24:44.428255633 +0000 UTC m=+1270.683153521"
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.563333 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.660348 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data\") pod \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") "
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.660439 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcsm2\" (UniqueName: \"kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2\") pod \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") "
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.660601 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle\") pod \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\" (UID: \"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32\") "
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.665561 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2" (OuterVolumeSpecName: "kube-api-access-qcsm2") pod "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" (UID: "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32"). InnerVolumeSpecName "kube-api-access-qcsm2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.683796 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data" (OuterVolumeSpecName: "config-data") pod "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" (UID: "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.684354 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" (UID: "0e8e02c2-3c36-440a-b7aa-d39b27f3bd32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.763845 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.763874 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcsm2\" (UniqueName: \"kubernetes.io/projected/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-kube-api-access-qcsm2\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:44 crc kubenswrapper[4883]: I0310 09:24:44.763890 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.412223 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e8e02c2-3c36-440a-b7aa-d39b27f3bd32","Type":"ContainerDied","Data":"3df63ff82b84df0c9c734c9449801557a1a4dedf62a22952150b25c471cef32b"}
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.412312 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.412355 4883 scope.go:117] "RemoveContainer" containerID="fda8a7af1c52f69233e2ff5b9090007030a88af5530808dd1c1ae5e86ac05d25"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.456309 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.466126 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.473519 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 09:24:45 crc kubenswrapper[4883]: E0310 09:24:45.474163 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" containerName="nova-scheduler-scheduler"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.474227 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" containerName="nova-scheduler-scheduler"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.474492 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" containerName="nova-scheduler-scheduler"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.475437 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.479435 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.485961 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.578565 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvl78\" (UniqueName: \"kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.578743 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.578910 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.680439 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.680515 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.680613 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvl78\" (UniqueName: \"kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.686277 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.686814 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.697466 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvl78\" (UniqueName: \"kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78\") pod \"nova-scheduler-0\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " pod="openstack/nova-scheduler-0"
Mar 10 09:24:45 crc kubenswrapper[4883]: I0310 09:24:45.796274 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.091328 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e8e02c2-3c36-440a-b7aa-d39b27f3bd32" path="/var/lib/kubelet/pods/0e8e02c2-3c36-440a-b7aa-d39b27f3bd32/volumes"
Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.197599 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 09:24:46 crc kubenswrapper[4883]: W0310 09:24:46.198062 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36ece3b9_2a8b_4cfd_b78c_09adc594ac3b.slice/crio-e02bf1bb21736ce1a051604e9980924295d62264239f30728f6dd60f080541da WatchSource:0}: Error finding container e02bf1bb21736ce1a051604e9980924295d62264239f30728f6dd60f080541da: Status 404 returned error can't find the container with id e02bf1bb21736ce1a051604e9980924295d62264239f30728f6dd60f080541da
Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.426219 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b","Type":"ContainerStarted","Data":"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5"}
Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.427008 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b","Type":"ContainerStarted","Data":"e02bf1bb21736ce1a051604e9980924295d62264239f30728f6dd60f080541da"}
Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.449162 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.449138649 podStartE2EDuration="1.449138649s" podCreationTimestamp="2026-03-10 09:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:46.443393435 +0000 UTC m=+1272.698291325" watchObservedRunningTime="2026-03-10 09:24:46.449138649 +0000 UTC m=+1272.704036537"
Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.633196 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:46 crc kubenswrapper[4883]: I0310 09:24:46.655365 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:47 crc kubenswrapper[4883]: I0310 09:24:47.459738 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 09:24:47 crc kubenswrapper[4883]: I0310 09:24:47.767645 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 09:24:47 crc kubenswrapper[4883]: I0310 09:24:47.767734 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 09:24:48 crc kubenswrapper[4883]: I0310 09:24:48.807226 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.219679 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-j9wml"]
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.221276 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.223229 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.223490 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.237973 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-j9wml"]
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.370554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9dg2\" (UniqueName: \"kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.370919 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.371166 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.371558 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.473810 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.473893 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9dg2\" (UniqueName: \"kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.474055 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.474112 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.481345 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.482229 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.482611 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.487502 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9dg2\" (UniqueName: \"kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2\") pod \"nova-cell1-cell-mapping-j9wml\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.540380 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j9wml"
Mar 10 09:24:49 crc kubenswrapper[4883]: I0310 09:24:49.948354 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-j9wml"]
Mar 10 09:24:49 crc kubenswrapper[4883]: W0310 09:24:49.950404 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf096652_ae85_4c98_8821_cd47eafae98f.slice/crio-b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56 WatchSource:0}: Error finding container b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56: Status 404 returned error can't find the container with id b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56
Mar 10 09:24:50 crc kubenswrapper[4883]: I0310 09:24:50.487315 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j9wml" event={"ID":"cf096652-ae85-4c98-8821-cd47eafae98f","Type":"ContainerStarted","Data":"354a3c28cf873c296da4850a04f59db6b94dbed1b8ba2c5447e9a4525bd40aed"}
Mar 10 09:24:50 crc kubenswrapper[4883]: I0310 09:24:50.487692 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j9wml" event={"ID":"cf096652-ae85-4c98-8821-cd47eafae98f","Type":"ContainerStarted","Data":"b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56"}
Mar 10 09:24:50 crc kubenswrapper[4883]: I0310 09:24:50.517280 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-j9wml" podStartSLOduration=1.5172581539999999 podStartE2EDuration="1.517258154s" podCreationTimestamp="2026-03-10 09:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:24:50.505093076 +0000 UTC m=+1276.759990965" watchObservedRunningTime="2026-03-10 09:24:50.517258154 +0000 UTC m=+1276.772156044"
Mar 10 
09:24:50 crc kubenswrapper[4883]: I0310 09:24:50.796854 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 09:24:52 crc kubenswrapper[4883]: I0310 09:24:52.078709 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 09:24:52 crc kubenswrapper[4883]: I0310 09:24:52.079140 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 09:24:52 crc kubenswrapper[4883]: I0310 09:24:52.767883 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 09:24:52 crc kubenswrapper[4883]: I0310 09:24:52.767941 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 09:24:53 crc kubenswrapper[4883]: I0310 09:24:53.119688 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 09:24:53 crc kubenswrapper[4883]: I0310 09:24:53.161329 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 09:24:53 crc kubenswrapper[4883]: I0310 09:24:53.782699 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 09:24:53 crc kubenswrapper[4883]: I0310 09:24:53.782705 4883 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 09:24:54 crc kubenswrapper[4883]: I0310 09:24:54.525395 4883 generic.go:334] "Generic (PLEG): container finished" podID="cf096652-ae85-4c98-8821-cd47eafae98f" containerID="354a3c28cf873c296da4850a04f59db6b94dbed1b8ba2c5447e9a4525bd40aed" exitCode=0 Mar 10 09:24:54 crc kubenswrapper[4883]: I0310 09:24:54.525486 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j9wml" event={"ID":"cf096652-ae85-4c98-8821-cd47eafae98f","Type":"ContainerDied","Data":"354a3c28cf873c296da4850a04f59db6b94dbed1b8ba2c5447e9a4525bd40aed"} Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.365285 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.797269 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.819904 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.832216 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j9wml" Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.901759 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle\") pod \"cf096652-ae85-4c98-8821-cd47eafae98f\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.901896 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts\") pod \"cf096652-ae85-4c98-8821-cd47eafae98f\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.901979 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data\") pod \"cf096652-ae85-4c98-8821-cd47eafae98f\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.902042 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9dg2\" (UniqueName: \"kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2\") pod \"cf096652-ae85-4c98-8821-cd47eafae98f\" (UID: \"cf096652-ae85-4c98-8821-cd47eafae98f\") " Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.908966 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2" (OuterVolumeSpecName: "kube-api-access-t9dg2") pod "cf096652-ae85-4c98-8821-cd47eafae98f" (UID: "cf096652-ae85-4c98-8821-cd47eafae98f"). InnerVolumeSpecName "kube-api-access-t9dg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.911607 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts" (OuterVolumeSpecName: "scripts") pod "cf096652-ae85-4c98-8821-cd47eafae98f" (UID: "cf096652-ae85-4c98-8821-cd47eafae98f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.928141 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf096652-ae85-4c98-8821-cd47eafae98f" (UID: "cf096652-ae85-4c98-8821-cd47eafae98f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:55 crc kubenswrapper[4883]: I0310 09:24:55.932712 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data" (OuterVolumeSpecName: "config-data") pod "cf096652-ae85-4c98-8821-cd47eafae98f" (UID: "cf096652-ae85-4c98-8821-cd47eafae98f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.003063 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.003089 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9dg2\" (UniqueName: \"kubernetes.io/projected/cf096652-ae85-4c98-8821-cd47eafae98f-kube-api-access-t9dg2\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.003102 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.003111 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf096652-ae85-4c98-8821-cd47eafae98f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.548640 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-j9wml" event={"ID":"cf096652-ae85-4c98-8821-cd47eafae98f","Type":"ContainerDied","Data":"b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56"} Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.548990 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b49e06ef6cb80ccba7934e35c1c39b14ac278a363b273dc9d1720347e03cdc56" Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.548679 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-j9wml" Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.583034 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.720489 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.720790 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-log" containerID="cri-o://162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a" gracePeriod=30 Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.720875 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-api" containerID="cri-o://9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8" gracePeriod=30 Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.745423 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.745666 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-log" containerID="cri-o://d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98" gracePeriod=30 Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.745780 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-metadata" containerID="cri-o://903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f" gracePeriod=30 Mar 10 09:24:56 crc kubenswrapper[4883]: I0310 09:24:56.991969 4883 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:24:57 crc kubenswrapper[4883]: I0310 09:24:57.558073 4883 generic.go:334] "Generic (PLEG): container finished" podID="398e71db-8c97-477b-b92c-35829f9b7dee" containerID="d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98" exitCode=143 Mar 10 09:24:57 crc kubenswrapper[4883]: I0310 09:24:57.558143 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerDied","Data":"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98"} Mar 10 09:24:57 crc kubenswrapper[4883]: I0310 09:24:57.560124 4883 generic.go:334] "Generic (PLEG): container finished" podID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerID="162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a" exitCode=143 Mar 10 09:24:57 crc kubenswrapper[4883]: I0310 09:24:57.560213 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerDied","Data":"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a"} Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.406202 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.406442 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5094e588-6ef7-4214-a96e-26d75ad98977" containerName="kube-state-metrics" containerID="cri-o://c0101a4659f1590e10ca756263b45be2c4afc16ddc5690114bf2e3350ef9322b" gracePeriod=30 Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.571595 4883 generic.go:334] "Generic (PLEG): container finished" podID="5094e588-6ef7-4214-a96e-26d75ad98977" containerID="c0101a4659f1590e10ca756263b45be2c4afc16ddc5690114bf2e3350ef9322b" exitCode=2 Mar 10 09:24:58 crc 
kubenswrapper[4883]: I0310 09:24:58.571690 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5094e588-6ef7-4214-a96e-26d75ad98977","Type":"ContainerDied","Data":"c0101a4659f1590e10ca756263b45be2c4afc16ddc5690114bf2e3350ef9322b"} Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.571859 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerName="nova-scheduler-scheduler" containerID="cri-o://6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" gracePeriod=30 Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.823351 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.856772 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k86gr\" (UniqueName: \"kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr\") pod \"5094e588-6ef7-4214-a96e-26d75ad98977\" (UID: \"5094e588-6ef7-4214-a96e-26d75ad98977\") " Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.877163 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr" (OuterVolumeSpecName: "kube-api-access-k86gr") pod "5094e588-6ef7-4214-a96e-26d75ad98977" (UID: "5094e588-6ef7-4214-a96e-26d75ad98977"). InnerVolumeSpecName "kube-api-access-k86gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:58 crc kubenswrapper[4883]: I0310 09:24:58.959427 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k86gr\" (UniqueName: \"kubernetes.io/projected/5094e588-6ef7-4214-a96e-26d75ad98977-kube-api-access-k86gr\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.586326 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5094e588-6ef7-4214-a96e-26d75ad98977","Type":"ContainerDied","Data":"8cea65d53e3ef54f53f5028aee0f66a936c83a86f17bc4112a8c38018507f5cd"} Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.586408 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.586416 4883 scope.go:117] "RemoveContainer" containerID="c0101a4659f1590e10ca756263b45be2c4afc16ddc5690114bf2e3350ef9322b" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.644857 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.659325 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.662046 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:24:59 crc kubenswrapper[4883]: E0310 09:24:59.662961 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5094e588-6ef7-4214-a96e-26d75ad98977" containerName="kube-state-metrics" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.662986 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5094e588-6ef7-4214-a96e-26d75ad98977" containerName="kube-state-metrics" Mar 10 09:24:59 crc kubenswrapper[4883]: E0310 09:24:59.663013 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cf096652-ae85-4c98-8821-cd47eafae98f" containerName="nova-manage" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.663021 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf096652-ae85-4c98-8821-cd47eafae98f" containerName="nova-manage" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.663196 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5094e588-6ef7-4214-a96e-26d75ad98977" containerName="kube-state-metrics" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.663218 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf096652-ae85-4c98-8821-cd47eafae98f" containerName="nova-manage" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.663921 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.669885 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.670253 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.674100 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.777007 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.777054 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.777132 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.777224 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp7c5\" (UniqueName: \"kubernetes.io/projected/39c373dd-952a-4305-82ed-1d047c7a859f-kube-api-access-rp7c5\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.879808 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.879911 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp7c5\" (UniqueName: \"kubernetes.io/projected/39c373dd-952a-4305-82ed-1d047c7a859f-kube-api-access-rp7c5\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.879983 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.880014 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.886228 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.886288 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.886972 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c373dd-952a-4305-82ed-1d047c7a859f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.895026 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp7c5\" (UniqueName: 
\"kubernetes.io/projected/39c373dd-952a-4305-82ed-1d047c7a859f-kube-api-access-rp7c5\") pod \"kube-state-metrics-0\" (UID: \"39c373dd-952a-4305-82ed-1d047c7a859f\") " pod="openstack/kube-state-metrics-0" Mar 10 09:24:59 crc kubenswrapper[4883]: I0310 09:24:59.991468 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.079794 4883 scope.go:117] "RemoveContainer" containerID="e20d3f6d5f3aae231c536075cd1098cf482fcd5c0cc1095b975e4d04ba285b0b" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.099301 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5094e588-6ef7-4214-a96e-26d75ad98977" path="/var/lib/kubelet/pods/5094e588-6ef7-4214-a96e-26d75ad98977/volumes" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.147313 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.147616 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-central-agent" containerID="cri-o://1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e" gracePeriod=30 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.148074 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="proxy-httpd" containerID="cri-o://ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb" gracePeriod=30 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.148136 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="sg-core" containerID="cri-o://3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96" gracePeriod=30 Mar 
10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.148181 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-notification-agent" containerID="cri-o://aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830" gracePeriod=30 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.254224 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.328807 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.388950 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data\") pod \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.389052 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfvw5\" (UniqueName: \"kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5\") pod \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.389170 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs\") pod \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.389198 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle\") pod \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\" (UID: \"f34d69a2-fd0d-42e4-942f-178dbf2c1b55\") " Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.389605 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs" (OuterVolumeSpecName: "logs") pod "f34d69a2-fd0d-42e4-942f-178dbf2c1b55" (UID: "f34d69a2-fd0d-42e4-942f-178dbf2c1b55"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.391761 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.396099 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5" (OuterVolumeSpecName: "kube-api-access-rfvw5") pod "f34d69a2-fd0d-42e4-942f-178dbf2c1b55" (UID: "f34d69a2-fd0d-42e4-942f-178dbf2c1b55"). InnerVolumeSpecName "kube-api-access-rfvw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.414133 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f34d69a2-fd0d-42e4-942f-178dbf2c1b55" (UID: "f34d69a2-fd0d-42e4-942f-178dbf2c1b55"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.431866 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data" (OuterVolumeSpecName: "config-data") pod "f34d69a2-fd0d-42e4-942f-178dbf2c1b55" (UID: "f34d69a2-fd0d-42e4-942f-178dbf2c1b55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:00 crc kubenswrapper[4883]: W0310 09:25:00.479229 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39c373dd_952a_4305_82ed_1d047c7a859f.slice/crio-501a3f53c407e29c3c2fa4c2789e88141b465b69eb039eedcd5031106e79d6d9 WatchSource:0}: Error finding container 501a3f53c407e29c3c2fa4c2789e88141b465b69eb039eedcd5031106e79d6d9: Status 404 returned error can't find the container with id 501a3f53c407e29c3c2fa4c2789e88141b465b69eb039eedcd5031106e79d6d9 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.481532 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.493966 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle\") pod \"398e71db-8c97-477b-b92c-35829f9b7dee\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.494247 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs\") pod \"398e71db-8c97-477b-b92c-35829f9b7dee\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.494653 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data\") pod \"398e71db-8c97-477b-b92c-35829f9b7dee\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.494692 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6k6s\" (UniqueName: \"kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s\") pod \"398e71db-8c97-477b-b92c-35829f9b7dee\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.494748 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs\") pod \"398e71db-8c97-477b-b92c-35829f9b7dee\" (UID: \"398e71db-8c97-477b-b92c-35829f9b7dee\") " Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.495613 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.495661 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfvw5\" (UniqueName: \"kubernetes.io/projected/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-kube-api-access-rfvw5\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.495674 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34d69a2-fd0d-42e4-942f-178dbf2c1b55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.496147 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs" (OuterVolumeSpecName: 
"logs") pod "398e71db-8c97-477b-b92c-35829f9b7dee" (UID: "398e71db-8c97-477b-b92c-35829f9b7dee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.502404 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s" (OuterVolumeSpecName: "kube-api-access-j6k6s") pod "398e71db-8c97-477b-b92c-35829f9b7dee" (UID: "398e71db-8c97-477b-b92c-35829f9b7dee"). InnerVolumeSpecName "kube-api-access-j6k6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.516087 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "398e71db-8c97-477b-b92c-35829f9b7dee" (UID: "398e71db-8c97-477b-b92c-35829f9b7dee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.521803 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data" (OuterVolumeSpecName: "config-data") pod "398e71db-8c97-477b-b92c-35829f9b7dee" (UID: "398e71db-8c97-477b-b92c-35829f9b7dee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.534882 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "398e71db-8c97-477b-b92c-35829f9b7dee" (UID: "398e71db-8c97-477b-b92c-35829f9b7dee"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.598876 4883 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.598913 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.598930 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6k6s\" (UniqueName: \"kubernetes.io/projected/398e71db-8c97-477b-b92c-35829f9b7dee-kube-api-access-j6k6s\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.598944 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/398e71db-8c97-477b-b92c-35829f9b7dee-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.598955 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/398e71db-8c97-477b-b92c-35829f9b7dee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.600137 4883 generic.go:334] "Generic (PLEG): container finished" podID="398e71db-8c97-477b-b92c-35829f9b7dee" containerID="903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f" exitCode=0 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.600215 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.600238 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerDied","Data":"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.600317 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"398e71db-8c97-477b-b92c-35829f9b7dee","Type":"ContainerDied","Data":"f719cd0714eff1ee6766787e51cf9b5c14efc5a4992031e0aeeeb87d03232888"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.600362 4883 scope.go:117] "RemoveContainer" containerID="903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.602598 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"39c373dd-952a-4305-82ed-1d047c7a859f","Type":"ContainerStarted","Data":"501a3f53c407e29c3c2fa4c2789e88141b465b69eb039eedcd5031106e79d6d9"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.605121 4883 generic.go:334] "Generic (PLEG): container finished" podID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerID="9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8" exitCode=0 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.605227 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.605235 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerDied","Data":"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.605301 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34d69a2-fd0d-42e4-942f-178dbf2c1b55","Type":"ContainerDied","Data":"b39a955be44f48c7e8bcca14deb268540526603ce0e5d25ae8ee8ee9e00bbf62"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.610978 4883 generic.go:334] "Generic (PLEG): container finished" podID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerID="ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb" exitCode=0 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.611076 4883 generic.go:334] "Generic (PLEG): container finished" podID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerID="3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96" exitCode=2 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.611086 4883 generic.go:334] "Generic (PLEG): container finished" podID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerID="1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e" exitCode=0 Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.611110 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerDied","Data":"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.611145 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerDied","Data":"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96"} 
Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.611159 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerDied","Data":"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e"} Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.645117 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.659363 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.673904 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.683360 4883 scope.go:117] "RemoveContainer" containerID="d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.686360 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.686831 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-metadata" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.686851 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-metadata" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.686873 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-api" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.686879 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-api" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.686908 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-log" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.686915 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-log" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.686932 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-log" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.686938 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-log" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.687106 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-log" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.687126 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-api" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.687138 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" containerName="nova-metadata-metadata" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.687149 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" containerName="nova-api-log" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.688105 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.690492 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.708317 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.725576 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.725752 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-config-data\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.725778 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.727506 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-logs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.727585 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc5zn\" (UniqueName: \"kubernetes.io/projected/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-kube-api-access-kc5zn\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.729059 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.747607 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.749865 4883 scope.go:117] "RemoveContainer" containerID="903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.750265 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f\": container with ID starting with 903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f not found: ID does not exist" containerID="903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.750310 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f"} err="failed to get container status \"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f\": rpc error: code = NotFound desc = could not find container \"903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f\": container with ID starting with 903cb097669cce6d96e9c00b3b399b553bd45d30ba45c483081d967fe814208f not found: ID does not exist" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.750336 4883 scope.go:117] "RemoveContainer" 
containerID="d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.750850 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98\": container with ID starting with d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98 not found: ID does not exist" containerID="d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.750881 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98"} err="failed to get container status \"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98\": rpc error: code = NotFound desc = could not find container \"d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98\": container with ID starting with d152a8bafe3974896e0b6ed06c8d51430640fd0fce5d05ddb8cd14d3ece0fe98 not found: ID does not exist" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.750905 4883 scope.go:117] "RemoveContainer" containerID="9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.758537 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.760377 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.762217 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.766508 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.773118 4883 scope.go:117] "RemoveContainer" containerID="162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.789230 4883 scope.go:117] "RemoveContainer" containerID="9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.790235 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8\": container with ID starting with 9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8 not found: ID does not exist" containerID="9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.790283 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8"} err="failed to get container status \"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8\": rpc error: code = NotFound desc = could not find container \"9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8\": container with ID starting with 9b8c8456881851870a8df3cf05aa509cc9b6d2c2cb2998c2c0ed3d989f9a6dc8 not found: ID does not exist" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.790314 4883 scope.go:117] "RemoveContainer" containerID="162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a" Mar 10 09:25:00 crc kubenswrapper[4883]: 
E0310 09:25:00.790686 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a\": container with ID starting with 162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a not found: ID does not exist" containerID="162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.790720 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a"} err="failed to get container status \"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a\": rpc error: code = NotFound desc = could not find container \"162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a\": container with ID starting with 162d6bd1c80c8d38642762552c3cc6f1373759b123cd4e9cb2202eeb6f91af6a not found: ID does not exist" Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.798236 4883 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.799941 4883 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.801306 4883 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec 
PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 09:25:00 crc kubenswrapper[4883]: E0310 09:25:00.801344 4883 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerName="nova-scheduler-scheduler" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828526 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828571 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-config-data\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828642 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828711 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " 
pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828789 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828831 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-logs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.828898 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc5zn\" (UniqueName: \"kubernetes.io/projected/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-kube-api-access-kc5zn\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.829127 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb2dl\" (UniqueName: \"kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.829302 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-logs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.829407 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.832122 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-config-data\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.833943 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.834393 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.847909 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc5zn\" (UniqueName: \"kubernetes.io/projected/0743bd84-b1d5-4634-9a7f-2c9daf2a5994-kube-api-access-kc5zn\") pod \"nova-metadata-0\" (UID: \"0743bd84-b1d5-4634-9a7f-2c9daf2a5994\") " pod="openstack/nova-metadata-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.930322 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb2dl\" (UniqueName: \"kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl\") pod \"nova-api-0\" (UID: 
\"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.930430 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.930502 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.930554 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.931369 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.934448 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.935771 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:00 crc kubenswrapper[4883]: I0310 09:25:00.944892 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb2dl\" (UniqueName: \"kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl\") pod \"nova-api-0\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " pod="openstack/nova-api-0" Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.033081 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.074198 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:01 crc kubenswrapper[4883]: W0310 09:25:01.503838 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0743bd84_b1d5_4634_9a7f_2c9daf2a5994.slice/crio-b89a274b6b99b40e5c7418d9be1c288fba7e8c8f8b645b4ae771621c50a66315 WatchSource:0}: Error finding container b89a274b6b99b40e5c7418d9be1c288fba7e8c8f8b645b4ae771621c50a66315: Status 404 returned error can't find the container with id b89a274b6b99b40e5c7418d9be1c288fba7e8c8f8b645b4ae771621c50a66315 Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.505376 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.555890 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:01 crc kubenswrapper[4883]: W0310 09:25:01.562783 4883 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f69269f_4be5_4302_b2ad_8f38012ef305.slice/crio-f6397c2d10b284f9002d1ac8c830d0d1865e88294fe5c8f541bafc5bd527b7d0 WatchSource:0}: Error finding container f6397c2d10b284f9002d1ac8c830d0d1865e88294fe5c8f541bafc5bd527b7d0: Status 404 returned error can't find the container with id f6397c2d10b284f9002d1ac8c830d0d1865e88294fe5c8f541bafc5bd527b7d0 Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.624062 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"39c373dd-952a-4305-82ed-1d047c7a859f","Type":"ContainerStarted","Data":"2c87402ad2966ba02b67f901907c0989443a467ee915fa75f74aa0bd8d1b8283"} Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.624517 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.627710 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0743bd84-b1d5-4634-9a7f-2c9daf2a5994","Type":"ContainerStarted","Data":"b89a274b6b99b40e5c7418d9be1c288fba7e8c8f8b645b4ae771621c50a66315"} Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.631613 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerStarted","Data":"f6397c2d10b284f9002d1ac8c830d0d1865e88294fe5c8f541bafc5bd527b7d0"} Mar 10 09:25:01 crc kubenswrapper[4883]: I0310 09:25:01.652009 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.354639689 podStartE2EDuration="2.651991424s" podCreationTimestamp="2026-03-10 09:24:59 +0000 UTC" firstStartedPulling="2026-03-10 09:25:00.482299674 +0000 UTC m=+1286.737197563" lastFinishedPulling="2026-03-10 09:25:00.779651408 +0000 UTC m=+1287.034549298" observedRunningTime="2026-03-10 09:25:01.640820079 +0000 
UTC m=+1287.895717968" watchObservedRunningTime="2026-03-10 09:25:01.651991424 +0000 UTC m=+1287.906889314" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.089448 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398e71db-8c97-477b-b92c-35829f9b7dee" path="/var/lib/kubelet/pods/398e71db-8c97-477b-b92c-35829f9b7dee/volumes" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.090069 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34d69a2-fd0d-42e4-942f-178dbf2c1b55" path="/var/lib/kubelet/pods/f34d69a2-fd0d-42e4-942f-178dbf2c1b55/volumes" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.283520 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.463270 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data\") pod \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.463334 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvl78\" (UniqueName: \"kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78\") pod \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.463374 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle\") pod \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\" (UID: \"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b\") " Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.469351 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78" (OuterVolumeSpecName: "kube-api-access-xvl78") pod "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" (UID: "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b"). InnerVolumeSpecName "kube-api-access-xvl78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.493809 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" (UID: "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.494906 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data" (OuterVolumeSpecName: "config-data") pod "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" (UID: "36ece3b9-2a8b-4cfd-b78c-09adc594ac3b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.565927 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvl78\" (UniqueName: \"kubernetes.io/projected/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-kube-api-access-xvl78\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.565957 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.565967 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.650912 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerStarted","Data":"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367"} Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.650982 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerStarted","Data":"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6"} Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.654177 4883 generic.go:334] "Generic (PLEG): container finished" podID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" exitCode=0 Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.654232 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.654230 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b","Type":"ContainerDied","Data":"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5"} Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.654373 4883 scope.go:117] "RemoveContainer" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.654587 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"36ece3b9-2a8b-4cfd-b78c-09adc594ac3b","Type":"ContainerDied","Data":"e02bf1bb21736ce1a051604e9980924295d62264239f30728f6dd60f080541da"} Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.656643 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0743bd84-b1d5-4634-9a7f-2c9daf2a5994","Type":"ContainerStarted","Data":"5d21d33e3fa85744f491a32aabda12b4855841d4a5e55770849f877479307046"} Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.656680 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0743bd84-b1d5-4634-9a7f-2c9daf2a5994","Type":"ContainerStarted","Data":"e290a0fe114ae46c4e34a57982c5ccdd3bc36a159ca8d0ca9f2bee7d317ec4a7"} Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.677358 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.677342776 podStartE2EDuration="2.677342776s" podCreationTimestamp="2026-03-10 09:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:25:02.672763332 +0000 UTC m=+1288.927661221" watchObservedRunningTime="2026-03-10 09:25:02.677342776 +0000 UTC m=+1288.932240666" Mar 10 
09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.680919 4883 scope.go:117] "RemoveContainer" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" Mar 10 09:25:02 crc kubenswrapper[4883]: E0310 09:25:02.681328 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5\": container with ID starting with 6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5 not found: ID does not exist" containerID="6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.681408 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5"} err="failed to get container status \"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5\": rpc error: code = NotFound desc = could not find container \"6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5\": container with ID starting with 6840039f49e5a7189c72e8df49d0cc9ed231f15b2e7a39fdaa5e884d5790dcf5 not found: ID does not exist" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.696022 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.696007272 podStartE2EDuration="2.696007272s" podCreationTimestamp="2026-03-10 09:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:25:02.690498183 +0000 UTC m=+1288.945396073" watchObservedRunningTime="2026-03-10 09:25:02.696007272 +0000 UTC m=+1288.950905161" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.712298 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 
09:25:02.722470 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.730135 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:25:02 crc kubenswrapper[4883]: E0310 09:25:02.730662 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerName="nova-scheduler-scheduler" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.730687 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerName="nova-scheduler-scheduler" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.730919 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" containerName="nova-scheduler-scheduler" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.731649 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.733544 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.738938 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.872070 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.872158 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-265bk\" (UniqueName: 
\"kubernetes.io/projected/626b3115-ced1-45ea-8401-e2bd7e79a20c-kube-api-access-265bk\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.872202 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-config-data\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.973661 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.974019 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-265bk\" (UniqueName: \"kubernetes.io/projected/626b3115-ced1-45ea-8401-e2bd7e79a20c-kube-api-access-265bk\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.974063 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-config-data\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.978567 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.979671 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626b3115-ced1-45ea-8401-e2bd7e79a20c-config-data\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0" Mar 10 09:25:02 crc kubenswrapper[4883]: I0310 09:25:02.989729 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-265bk\" (UniqueName: \"kubernetes.io/projected/626b3115-ced1-45ea-8401-e2bd7e79a20c-kube-api-access-265bk\") pod \"nova-scheduler-0\" (UID: \"626b3115-ced1-45ea-8401-e2bd7e79a20c\") " pod="openstack/nova-scheduler-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.050770 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.204230 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.383834 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.383935 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.383983 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.384038 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.384095 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpz25\" (UniqueName: \"kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.384144 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.384169 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts\") pod \"13fc6b71-b633-4726-ad0d-91a04b592d3b\" (UID: \"13fc6b71-b633-4726-ad0d-91a04b592d3b\") " Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.386783 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.388218 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.393654 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25" (OuterVolumeSpecName: "kube-api-access-bpz25") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "kube-api-access-bpz25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.399898 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts" (OuterVolumeSpecName: "scripts") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.416543 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.451666 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.470740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data" (OuterVolumeSpecName: "config-data") pod "13fc6b71-b633-4726-ad0d-91a04b592d3b" (UID: "13fc6b71-b633-4726-ad0d-91a04b592d3b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494605 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494646 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494659 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494670 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494678 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/13fc6b71-b633-4726-ad0d-91a04b592d3b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494687 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13fc6b71-b633-4726-ad0d-91a04b592d3b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.494696 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpz25\" (UniqueName: \"kubernetes.io/projected/13fc6b71-b633-4726-ad0d-91a04b592d3b-kube-api-access-bpz25\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.500994 4883 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:25:03 crc kubenswrapper[4883]: W0310 09:25:03.501511 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod626b3115_ced1_45ea_8401_e2bd7e79a20c.slice/crio-6903f2284adfe3d3184cd96a9760058fc2a0a242254e6d1376e5c90116cd44ab WatchSource:0}: Error finding container 6903f2284adfe3d3184cd96a9760058fc2a0a242254e6d1376e5c90116cd44ab: Status 404 returned error can't find the container with id 6903f2284adfe3d3184cd96a9760058fc2a0a242254e6d1376e5c90116cd44ab Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.666781 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"626b3115-ced1-45ea-8401-e2bd7e79a20c","Type":"ContainerStarted","Data":"98df58f7e20401aef58c2be9f2ce9527fe59a42792b1b41f8df8033469fd8ae0"} Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.667122 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"626b3115-ced1-45ea-8401-e2bd7e79a20c","Type":"ContainerStarted","Data":"6903f2284adfe3d3184cd96a9760058fc2a0a242254e6d1376e5c90116cd44ab"} Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.670296 4883 generic.go:334] "Generic (PLEG): container finished" podID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerID="aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830" exitCode=0 Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.670367 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.670423 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerDied","Data":"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830"} Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.670501 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"13fc6b71-b633-4726-ad0d-91a04b592d3b","Type":"ContainerDied","Data":"e06324548e387289ee5784fe269a4d67cc71d75aeed67c453f124447b142b7b8"} Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.670527 4883 scope.go:117] "RemoveContainer" containerID="ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.695635 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.695569215 podStartE2EDuration="1.695569215s" podCreationTimestamp="2026-03-10 09:25:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:25:03.682002091 +0000 UTC m=+1289.936899980" watchObservedRunningTime="2026-03-10 09:25:03.695569215 +0000 UTC m=+1289.950467103" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.709503 4883 scope.go:117] "RemoveContainer" containerID="3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.719586 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.729786 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.734394 4883 scope.go:117] "RemoveContainer" 
containerID="aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.740402 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.744002 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-notification-agent" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744027 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-notification-agent" Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.744043 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="proxy-httpd" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744120 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="proxy-httpd" Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.744371 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-central-agent" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744387 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-central-agent" Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.744403 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="sg-core" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744419 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="sg-core" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744612 4883 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-notification-agent" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744634 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="ceilometer-central-agent" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744641 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="sg-core" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.744654 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" containerName="proxy-httpd" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.746695 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.748675 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.751723 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.751798 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.753886 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.757933 4883 scope.go:117] "RemoveContainer" containerID="1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.782705 4883 scope.go:117] "RemoveContainer" containerID="ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb" Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.783128 4883 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb\": container with ID starting with ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb not found: ID does not exist" containerID="ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783160 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb"} err="failed to get container status \"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb\": rpc error: code = NotFound desc = could not find container \"ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb\": container with ID starting with ef0920abc2550d020abfa828f97715bb0a8c461d97845733ae9fb604af6739bb not found: ID does not exist" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783184 4883 scope.go:117] "RemoveContainer" containerID="3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96" Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.783580 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96\": container with ID starting with 3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96 not found: ID does not exist" containerID="3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783619 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96"} err="failed to get container status \"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96\": rpc error: code = NotFound desc = could not find container 
\"3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96\": container with ID starting with 3b2cc664a25a60cfe30f7e90b1c211ee22c17e926b5d8587d1c5fb5bde214b96 not found: ID does not exist" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783640 4883 scope.go:117] "RemoveContainer" containerID="aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830" Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.783909 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830\": container with ID starting with aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830 not found: ID does not exist" containerID="aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783940 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830"} err="failed to get container status \"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830\": rpc error: code = NotFound desc = could not find container \"aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830\": container with ID starting with aa44892079db2dfa1d4a3347ac7e9fbb5538f7a5e4652d0c6020ec2414c93830 not found: ID does not exist" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.783954 4883 scope.go:117] "RemoveContainer" containerID="1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e" Mar 10 09:25:03 crc kubenswrapper[4883]: E0310 09:25:03.784404 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e\": container with ID starting with 1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e not found: ID does not exist" 
containerID="1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.784447 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e"} err="failed to get container status \"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e\": rpc error: code = NotFound desc = could not find container \"1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e\": container with ID starting with 1a182854d82862bb5d69684eab105f2235eb91d761e7fe7168f8b9f85ad5047e not found: ID does not exist" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908011 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr2bq\" (UniqueName: \"kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908071 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908120 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908147 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908280 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908530 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908597 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:03 crc kubenswrapper[4883]: I0310 09:25:03.908688 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.010566 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " 
pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.010609 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.010666 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.011187 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.011849 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.012003 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.012348 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.012547 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.012726 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr2bq\" (UniqueName: \"kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.012797 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.016336 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.016344 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc 
kubenswrapper[4883]: I0310 09:25:04.017145 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.017320 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.017616 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.031032 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr2bq\" (UniqueName: \"kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq\") pod \"ceilometer-0\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.061838 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.095566 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13fc6b71-b633-4726-ad0d-91a04b592d3b" path="/var/lib/kubelet/pods/13fc6b71-b633-4726-ad0d-91a04b592d3b/volumes" Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.096339 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36ece3b9-2a8b-4cfd-b78c-09adc594ac3b" path="/var/lib/kubelet/pods/36ece3b9-2a8b-4cfd-b78c-09adc594ac3b/volumes" Mar 10 09:25:04 crc kubenswrapper[4883]: W0310 09:25:04.475272 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae500ef3_e9e8_490e_863f_7768270829a6.slice/crio-479fed75a96312b4631f3fcfa8a25bf62900caf4553cdbb06ce55d08736d68c3 WatchSource:0}: Error finding container 479fed75a96312b4631f3fcfa8a25bf62900caf4553cdbb06ce55d08736d68c3: Status 404 returned error can't find the container with id 479fed75a96312b4631f3fcfa8a25bf62900caf4553cdbb06ce55d08736d68c3 Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.479546 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:04 crc kubenswrapper[4883]: I0310 09:25:04.685237 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerStarted","Data":"479fed75a96312b4631f3fcfa8a25bf62900caf4553cdbb06ce55d08736d68c3"} Mar 10 09:25:05 crc kubenswrapper[4883]: I0310 09:25:05.699592 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerStarted","Data":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} Mar 10 09:25:06 crc kubenswrapper[4883]: I0310 09:25:06.033408 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 10 09:25:06 crc kubenswrapper[4883]: I0310 09:25:06.033468 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 09:25:06 crc kubenswrapper[4883]: I0310 09:25:06.710585 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerStarted","Data":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} Mar 10 09:25:06 crc kubenswrapper[4883]: I0310 09:25:06.710857 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerStarted","Data":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} Mar 10 09:25:08 crc kubenswrapper[4883]: I0310 09:25:08.052681 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 09:25:08 crc kubenswrapper[4883]: I0310 09:25:08.731186 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerStarted","Data":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} Mar 10 09:25:08 crc kubenswrapper[4883]: I0310 09:25:08.731702 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:25:08 crc kubenswrapper[4883]: I0310 09:25:08.755914 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.912326925 podStartE2EDuration="5.755892238s" podCreationTimestamp="2026-03-10 09:25:03 +0000 UTC" firstStartedPulling="2026-03-10 09:25:04.478511096 +0000 UTC m=+1290.733408985" lastFinishedPulling="2026-03-10 09:25:08.322076409 +0000 UTC m=+1294.576974298" observedRunningTime="2026-03-10 09:25:08.749743553 +0000 UTC m=+1295.004641443" watchObservedRunningTime="2026-03-10 09:25:08.755892238 
+0000 UTC m=+1295.010790117" Mar 10 09:25:10 crc kubenswrapper[4883]: I0310 09:25:10.003381 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 09:25:11 crc kubenswrapper[4883]: I0310 09:25:11.033689 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 09:25:11 crc kubenswrapper[4883]: I0310 09:25:11.035302 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 09:25:11 crc kubenswrapper[4883]: I0310 09:25:11.074732 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 09:25:11 crc kubenswrapper[4883]: I0310 09:25:11.074789 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 09:25:12 crc kubenswrapper[4883]: I0310 09:25:12.047603 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0743bd84-b1d5-4634-9a7f-2c9daf2a5994" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 09:25:12 crc kubenswrapper[4883]: I0310 09:25:12.047623 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0743bd84-b1d5-4634-9a7f-2c9daf2a5994" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 09:25:12 crc kubenswrapper[4883]: I0310 09:25:12.157648 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 
09:25:12 crc kubenswrapper[4883]: I0310 09:25:12.157688 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 09:25:13 crc kubenswrapper[4883]: I0310 09:25:13.050853 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 09:25:13 crc kubenswrapper[4883]: I0310 09:25:13.076512 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 09:25:13 crc kubenswrapper[4883]: I0310 09:25:13.802448 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.037644 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.038769 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.043748 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.078628 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.079209 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.081164 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.083794 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.863836 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.867876 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 09:25:21 crc kubenswrapper[4883]: I0310 09:25:21.868551 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.020078 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.021454 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.059094 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061231 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061300 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfsjg\" (UniqueName: \"kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061356 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061405 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061547 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.061622 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.163776 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.163983 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164017 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfsjg\" (UniqueName: \"kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164052 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164079 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164140 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164726 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.164885 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.165214 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.165454 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.169497 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.202141 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfsjg\" (UniqueName: 
\"kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg\") pod \"dnsmasq-dns-7749c44969-gf7ng\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.345551 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.761764 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:25:22 crc kubenswrapper[4883]: I0310 09:25:22.873867 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" event={"ID":"3612d60a-476b-48fa-9163-03c2886a64b2","Type":"ContainerStarted","Data":"bc8f46ec7a59322161bc14068b60976298d60f1cfa73a1b7887e26a2e987b797"} Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.800201 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.801454 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-central-agent" containerID="cri-o://05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" gracePeriod=30 Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.801578 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-notification-agent" containerID="cri-o://7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" gracePeriod=30 Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.801511 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="proxy-httpd" 
containerID="cri-o://da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" gracePeriod=30 Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.801533 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="sg-core" containerID="cri-o://a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" gracePeriod=30 Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.814911 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.211:3000/\": EOF" Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.884093 4883 generic.go:334] "Generic (PLEG): container finished" podID="3612d60a-476b-48fa-9163-03c2886a64b2" containerID="a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae" exitCode=0 Mar 10 09:25:23 crc kubenswrapper[4883]: I0310 09:25:23.884200 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" event={"ID":"3612d60a-476b-48fa-9163-03c2886a64b2","Type":"ContainerDied","Data":"a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.096295 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.632386 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.718847 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.718918 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.718951 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr2bq\" (UniqueName: \"kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.719001 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.719140 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.719292 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.719321 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.719366 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd\") pod \"ae500ef3-e9e8-490e-863f-7768270829a6\" (UID: \"ae500ef3-e9e8-490e-863f-7768270829a6\") " Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.720278 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.720552 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.725549 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts" (OuterVolumeSpecName: "scripts") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.725691 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq" (OuterVolumeSpecName: "kube-api-access-mr2bq") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "kube-api-access-mr2bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.742957 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.759129 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.779804 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822616 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822657 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae500ef3-e9e8-490e-863f-7768270829a6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822667 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822680 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822691 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr2bq\" (UniqueName: \"kubernetes.io/projected/ae500ef3-e9e8-490e-863f-7768270829a6-kube-api-access-mr2bq\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822700 4883 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.822708 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.828577 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data" (OuterVolumeSpecName: "config-data") pod "ae500ef3-e9e8-490e-863f-7768270829a6" (UID: "ae500ef3-e9e8-490e-863f-7768270829a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.896403 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" event={"ID":"3612d60a-476b-48fa-9163-03c2886a64b2","Type":"ContainerStarted","Data":"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.897045 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899588 4883 generic.go:334] "Generic (PLEG): container finished" podID="ae500ef3-e9e8-490e-863f-7768270829a6" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" exitCode=0 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899623 4883 generic.go:334] "Generic (PLEG): container finished" podID="ae500ef3-e9e8-490e-863f-7768270829a6" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" exitCode=2 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899657 4883 generic.go:334] "Generic (PLEG): container finished" podID="ae500ef3-e9e8-490e-863f-7768270829a6" 
containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" exitCode=0 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899666 4883 generic.go:334] "Generic (PLEG): container finished" podID="ae500ef3-e9e8-490e-863f-7768270829a6" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" exitCode=0 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899671 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899664 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerDied","Data":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899841 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerDied","Data":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899865 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerDied","Data":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899878 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerDied","Data":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899888 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ae500ef3-e9e8-490e-863f-7768270829a6","Type":"ContainerDied","Data":"479fed75a96312b4631f3fcfa8a25bf62900caf4553cdbb06ce55d08736d68c3"} 
Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.899906 4883 scope.go:117] "RemoveContainer" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.900158 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-log" containerID="cri-o://8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6" gracePeriod=30 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.900413 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-api" containerID="cri-o://4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367" gracePeriod=30 Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.924015 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae500ef3-e9e8-490e-863f-7768270829a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.927571 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" podStartSLOduration=3.927560458 podStartE2EDuration="3.927560458s" podCreationTimestamp="2026-03-10 09:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:25:24.921468511 +0000 UTC m=+1311.176366410" watchObservedRunningTime="2026-03-10 09:25:24.927560458 +0000 UTC m=+1311.182458347" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.929656 4883 scope.go:117] "RemoveContainer" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.949110 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.950371 4883 scope.go:117] "RemoveContainer" containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.961796 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.972890 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:24 crc kubenswrapper[4883]: E0310 09:25:24.973427 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="sg-core" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973450 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="sg-core" Mar 10 09:25:24 crc kubenswrapper[4883]: E0310 09:25:24.973468 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="proxy-httpd" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973491 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="proxy-httpd" Mar 10 09:25:24 crc kubenswrapper[4883]: E0310 09:25:24.973511 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-notification-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973518 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-notification-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: E0310 09:25:24.973528 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-central-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973534 4883 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-central-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973776 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="proxy-httpd" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973798 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-notification-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973807 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="sg-core" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.973823 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" containerName="ceilometer-central-agent" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.975644 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.976629 4883 scope.go:117] "RemoveContainer" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.981336 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.981681 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.982006 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.991906 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:24 crc kubenswrapper[4883]: I0310 09:25:24.999780 4883 scope.go:117] "RemoveContainer" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:25 crc kubenswrapper[4883]: E0310 09:25:25.000429 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": container with ID starting with da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1 not found: ID does not exist" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.000462 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} err="failed to get container status \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": rpc error: code = NotFound desc = could not find container \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": 
container with ID starting with da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.000603 4883 scope.go:117] "RemoveContainer" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:25 crc kubenswrapper[4883]: E0310 09:25:25.001036 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": container with ID starting with a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95 not found: ID does not exist" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001071 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} err="failed to get container status \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": rpc error: code = NotFound desc = could not find container \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": container with ID starting with a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001097 4883 scope.go:117] "RemoveContainer" containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:25 crc kubenswrapper[4883]: E0310 09:25:25.001373 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": container with ID starting with 7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4 not found: ID does not exist" 
containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001410 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} err="failed to get container status \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": rpc error: code = NotFound desc = could not find container \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": container with ID starting with 7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001437 4883 scope.go:117] "RemoveContainer" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:25 crc kubenswrapper[4883]: E0310 09:25:25.001730 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": container with ID starting with 05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10 not found: ID does not exist" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001754 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} err="failed to get container status \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": rpc error: code = NotFound desc = could not find container \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": container with ID starting with 05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001767 4883 scope.go:117] 
"RemoveContainer" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001956 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} err="failed to get container status \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": rpc error: code = NotFound desc = could not find container \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": container with ID starting with da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.001989 4883 scope.go:117] "RemoveContainer" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.002947 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} err="failed to get container status \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": rpc error: code = NotFound desc = could not find container \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": container with ID starting with a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.002969 4883 scope.go:117] "RemoveContainer" containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.003174 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} err="failed to get container status \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": rpc error: code = 
NotFound desc = could not find container \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": container with ID starting with 7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.003202 4883 scope.go:117] "RemoveContainer" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.003407 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} err="failed to get container status \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": rpc error: code = NotFound desc = could not find container \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": container with ID starting with 05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.003425 4883 scope.go:117] "RemoveContainer" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.004639 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} err="failed to get container status \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": rpc error: code = NotFound desc = could not find container \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": container with ID starting with da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.004664 4883 scope.go:117] "RemoveContainer" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:25 crc 
kubenswrapper[4883]: I0310 09:25:25.012528 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} err="failed to get container status \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": rpc error: code = NotFound desc = could not find container \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": container with ID starting with a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.012563 4883 scope.go:117] "RemoveContainer" containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.012941 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} err="failed to get container status \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": rpc error: code = NotFound desc = could not find container \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": container with ID starting with 7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.012988 4883 scope.go:117] "RemoveContainer" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017053 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} err="failed to get container status \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": rpc error: code = NotFound desc = could not find container \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": container 
with ID starting with 05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017098 4883 scope.go:117] "RemoveContainer" containerID="da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017551 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1"} err="failed to get container status \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": rpc error: code = NotFound desc = could not find container \"da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1\": container with ID starting with da69886b557144229a409671ce702be240fbe5ebacf833b3014e9a4bed1639a1 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017581 4883 scope.go:117] "RemoveContainer" containerID="a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017863 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95"} err="failed to get container status \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": rpc error: code = NotFound desc = could not find container \"a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95\": container with ID starting with a633c718c63f8697432538452932e0ad32178d27c114bfa34e1f04e14558dd95 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.017886 4883 scope.go:117] "RemoveContainer" containerID="7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.018287 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4"} err="failed to get container status \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": rpc error: code = NotFound desc = could not find container \"7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4\": container with ID starting with 7d63cb3d9829d948d402ee805ed710e484d581efd2efce1af4419971c9fb3cb4 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.018335 4883 scope.go:117] "RemoveContainer" containerID="05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.018737 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10"} err="failed to get container status \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": rpc error: code = NotFound desc = could not find container \"05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10\": container with ID starting with 05014d2513f024af40a7ec29ec4f547a575f9f1790424f11803edaec30071b10 not found: ID does not exist" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128654 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwpj6\" (UniqueName: \"kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128806 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " 
pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128847 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128933 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128968 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.128999 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.129033 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.129216 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.231598 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232378 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232465 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwpj6\" (UniqueName: \"kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232562 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232594 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data\") pod 
\"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232668 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232693 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.232719 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.233295 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.234723 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.239179 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.239369 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.240073 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.240210 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.244797 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.249141 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwpj6\" (UniqueName: \"kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6\") pod \"ceilometer-0\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") " pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.305670 4883 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.542078 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.712077 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:25 crc kubenswrapper[4883]: W0310 09:25:25.713687 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod718b65b8_f5c7_4933_945a_8e5e5dea72a4.slice/crio-348833b96142a07cd2b1850468f3df69af27cea75ca70ec73df2e8a23d17baed WatchSource:0}: Error finding container 348833b96142a07cd2b1850468f3df69af27cea75ca70ec73df2e8a23d17baed: Status 404 returned error can't find the container with id 348833b96142a07cd2b1850468f3df69af27cea75ca70ec73df2e8a23d17baed Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.915331 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerDied","Data":"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6"} Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.915228 4883 generic.go:334] "Generic (PLEG): container finished" podID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerID="8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6" exitCode=143 Mar 10 09:25:25 crc kubenswrapper[4883]: I0310 09:25:25.917625 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerStarted","Data":"348833b96142a07cd2b1850468f3df69af27cea75ca70ec73df2e8a23d17baed"} Mar 10 09:25:26 crc kubenswrapper[4883]: I0310 09:25:26.104908 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae500ef3-e9e8-490e-863f-7768270829a6" 
path="/var/lib/kubelet/pods/ae500ef3-e9e8-490e-863f-7768270829a6/volumes" Mar 10 09:25:26 crc kubenswrapper[4883]: I0310 09:25:26.927505 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerStarted","Data":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"} Mar 10 09:25:27 crc kubenswrapper[4883]: I0310 09:25:27.943584 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerStarted","Data":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"} Mar 10 09:25:27 crc kubenswrapper[4883]: I0310 09:25:27.943860 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerStarted","Data":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"} Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.443573 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.608869 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data\") pod \"6f69269f-4be5-4302-b2ad-8f38012ef305\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.609019 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle\") pod \"6f69269f-4be5-4302-b2ad-8f38012ef305\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.609071 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs\") pod \"6f69269f-4be5-4302-b2ad-8f38012ef305\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.609236 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb2dl\" (UniqueName: \"kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl\") pod \"6f69269f-4be5-4302-b2ad-8f38012ef305\" (UID: \"6f69269f-4be5-4302-b2ad-8f38012ef305\") " Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.611167 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs" (OuterVolumeSpecName: "logs") pod "6f69269f-4be5-4302-b2ad-8f38012ef305" (UID: "6f69269f-4be5-4302-b2ad-8f38012ef305"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.614133 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl" (OuterVolumeSpecName: "kube-api-access-cb2dl") pod "6f69269f-4be5-4302-b2ad-8f38012ef305" (UID: "6f69269f-4be5-4302-b2ad-8f38012ef305"). InnerVolumeSpecName "kube-api-access-cb2dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.637824 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data" (OuterVolumeSpecName: "config-data") pod "6f69269f-4be5-4302-b2ad-8f38012ef305" (UID: "6f69269f-4be5-4302-b2ad-8f38012ef305"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.649954 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f69269f-4be5-4302-b2ad-8f38012ef305" (UID: "6f69269f-4be5-4302-b2ad-8f38012ef305"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.711546 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.711826 4883 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f69269f-4be5-4302-b2ad-8f38012ef305-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.711836 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb2dl\" (UniqueName: \"kubernetes.io/projected/6f69269f-4be5-4302-b2ad-8f38012ef305-kube-api-access-cb2dl\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.711848 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f69269f-4be5-4302-b2ad-8f38012ef305-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.953973 4883 generic.go:334] "Generic (PLEG): container finished" podID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerID="4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367" exitCode=0 Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.954025 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerDied","Data":"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367"} Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.954031 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.954056 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6f69269f-4be5-4302-b2ad-8f38012ef305","Type":"ContainerDied","Data":"f6397c2d10b284f9002d1ac8c830d0d1865e88294fe5c8f541bafc5bd527b7d0"} Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.954074 4883 scope.go:117] "RemoveContainer" containerID="4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.978267 4883 scope.go:117] "RemoveContainer" containerID="8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.992625 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.997240 4883 scope.go:117] "RemoveContainer" containerID="4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367" Mar 10 09:25:28 crc kubenswrapper[4883]: E0310 09:25:28.997871 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367\": container with ID starting with 4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367 not found: ID does not exist" containerID="4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.997957 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367"} err="failed to get container status \"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367\": rpc error: code = NotFound desc = could not find container \"4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367\": container with ID starting with 
4447aa40f8d8985cad10881546322a699e45ad66ca5a7a1fa491c07c231a6367 not found: ID does not exist" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.998008 4883 scope.go:117] "RemoveContainer" containerID="8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6" Mar 10 09:25:28 crc kubenswrapper[4883]: E0310 09:25:28.998527 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6\": container with ID starting with 8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6 not found: ID does not exist" containerID="8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6" Mar 10 09:25:28 crc kubenswrapper[4883]: I0310 09:25:28.998572 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6"} err="failed to get container status \"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6\": rpc error: code = NotFound desc = could not find container \"8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6\": container with ID starting with 8205228feb15a15b3a5fe1317a05c8233c349d0c3a7c45442b4fd6cf9895cab6 not found: ID does not exist" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.000928 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.017492 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:29 crc kubenswrapper[4883]: E0310 09:25:29.017935 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-api" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.017958 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" 
containerName="nova-api-api" Mar 10 09:25:29 crc kubenswrapper[4883]: E0310 09:25:29.018004 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-log" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.018011 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-log" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.018221 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-api" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.018244 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" containerName="nova-api-log" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.019230 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.021786 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.022001 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.022379 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.028753 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.119554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-internal-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " 
pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.119714 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-config-data\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.119781 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-public-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.119906 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ae000e-33d5-4caa-8b61-dd1ab03b9978-logs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.119981 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.120050 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9l5c\" (UniqueName: \"kubernetes.io/projected/14ae000e-33d5-4caa-8b61-dd1ab03b9978-kube-api-access-s9l5c\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221583 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-config-data\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221645 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-public-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221690 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ae000e-33d5-4caa-8b61-dd1ab03b9978-logs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221734 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221767 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9l5c\" (UniqueName: \"kubernetes.io/projected/14ae000e-33d5-4caa-8b61-dd1ab03b9978-kube-api-access-s9l5c\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.221824 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-internal-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " 
pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.222850 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ae000e-33d5-4caa-8b61-dd1ab03b9978-logs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.227944 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-internal-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.228020 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.228426 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-config-data\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.228852 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ae000e-33d5-4caa-8b61-dd1ab03b9978-public-tls-certs\") pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.236768 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9l5c\" (UniqueName: \"kubernetes.io/projected/14ae000e-33d5-4caa-8b61-dd1ab03b9978-kube-api-access-s9l5c\") 
pod \"nova-api-0\" (UID: \"14ae000e-33d5-4caa-8b61-dd1ab03b9978\") " pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.340105 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.751552 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:25:29 crc kubenswrapper[4883]: W0310 09:25:29.757876 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ae000e_33d5_4caa_8b61_dd1ab03b9978.slice/crio-fb95f7414f2896871a87b668df7d8fd697323d11035390d802f4e5f55c74059c WatchSource:0}: Error finding container fb95f7414f2896871a87b668df7d8fd697323d11035390d802f4e5f55c74059c: Status 404 returned error can't find the container with id fb95f7414f2896871a87b668df7d8fd697323d11035390d802f4e5f55c74059c Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.965547 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14ae000e-33d5-4caa-8b61-dd1ab03b9978","Type":"ContainerStarted","Data":"4c319601427e6a2871510f3669bb92ab1cb581950f5e3a928817e32ff7fa92f1"} Mar 10 09:25:29 crc kubenswrapper[4883]: I0310 09:25:29.965917 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14ae000e-33d5-4caa-8b61-dd1ab03b9978","Type":"ContainerStarted","Data":"fb95f7414f2896871a87b668df7d8fd697323d11035390d802f4e5f55c74059c"} Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.095854 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f69269f-4be5-4302-b2ad-8f38012ef305" path="/var/lib/kubelet/pods/6f69269f-4be5-4302-b2ad-8f38012ef305/volumes" Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.979126 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"14ae000e-33d5-4caa-8b61-dd1ab03b9978","Type":"ContainerStarted","Data":"d42705f13688b92d2af86867c60cdd3f936e4cc75d857d04e5a1caf06f5d4373"} Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982347 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerStarted","Data":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"} Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982624 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-central-agent" containerID="cri-o://54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642" gracePeriod=30 Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982669 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="sg-core" containerID="cri-o://40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04" gracePeriod=30 Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982723 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="proxy-httpd" containerID="cri-o://d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f" gracePeriod=30 Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982705 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.982725 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-notification-agent" containerID="cri-o://f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be" 
gracePeriod=30 Mar 10 09:25:30 crc kubenswrapper[4883]: I0310 09:25:30.998245 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.998235251 podStartE2EDuration="2.998235251s" podCreationTimestamp="2026-03-10 09:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:25:30.994542958 +0000 UTC m=+1317.249440847" watchObservedRunningTime="2026-03-10 09:25:30.998235251 +0000 UTC m=+1317.253133140" Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.019141 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.059719354 podStartE2EDuration="7.019132866s" podCreationTimestamp="2026-03-10 09:25:24 +0000 UTC" firstStartedPulling="2026-03-10 09:25:25.716865229 +0000 UTC m=+1311.971763117" lastFinishedPulling="2026-03-10 09:25:30.676278751 +0000 UTC m=+1316.931176629" observedRunningTime="2026-03-10 09:25:31.010905423 +0000 UTC m=+1317.265803322" watchObservedRunningTime="2026-03-10 09:25:31.019132866 +0000 UTC m=+1317.274030756" Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.831954 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.978751 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") "
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.978805 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") "
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.978883 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") "
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.978997 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") "
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.979082 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") "
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.979191 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwpj6\" (UniqueName: \"kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") "
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.979322 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") "
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.979354 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd\") pod \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\" (UID: \"718b65b8-f5c7-4933-945a-8e5e5dea72a4\") "
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.980092 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.980219 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.985157 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts" (OuterVolumeSpecName: "scripts") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.985736 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6" (OuterVolumeSpecName: "kube-api-access-vwpj6") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "kube-api-access-vwpj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999274 4883 generic.go:334] "Generic (PLEG): container finished" podID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f" exitCode=0
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999336 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999341 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerDied","Data":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"}
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999414 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerDied","Data":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"}
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999370 4883 generic.go:334] "Generic (PLEG): container finished" podID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04" exitCode=2
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999453 4883 generic.go:334] "Generic (PLEG): container finished" podID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be" exitCode=0
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999487 4883 generic.go:334] "Generic (PLEG): container finished" podID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642" exitCode=0
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999505 4883 scope.go:117] "RemoveContainer" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999696 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerDied","Data":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"}
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999743 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerDied","Data":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"}
Mar 10 09:25:31 crc kubenswrapper[4883]: I0310 09:25:31.999763 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718b65b8-f5c7-4933-945a-8e5e5dea72a4","Type":"ContainerDied","Data":"348833b96142a07cd2b1850468f3df69af27cea75ca70ec73df2e8a23d17baed"}
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.006952 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.026390 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.042297 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.055340 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data" (OuterVolumeSpecName: "config-data") pod "718b65b8-f5c7-4933-945a-8e5e5dea72a4" (UID: "718b65b8-f5c7-4933-945a-8e5e5dea72a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.075035 4883 scope.go:117] "RemoveContainer" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082465 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwpj6\" (UniqueName: \"kubernetes.io/projected/718b65b8-f5c7-4933-945a-8e5e5dea72a4-kube-api-access-vwpj6\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082512 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082524 4883 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082534 4883 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718b65b8-f5c7-4933-945a-8e5e5dea72a4-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082546 4883 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082556 4883 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082566 4883 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.082575 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718b65b8-f5c7-4933-945a-8e5e5dea72a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.094729 4883 scope.go:117] "RemoveContainer" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.117665 4883 scope.go:117] "RemoveContainer" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.143890 4883 scope.go:117] "RemoveContainer" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"
Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.144243 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": container with ID starting with d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f not found: ID does not exist" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144278 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"} err="failed to get container status \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": rpc error: code = NotFound desc = could not find container \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": container with ID starting with d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144305 4883 scope.go:117] "RemoveContainer" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"
Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.144583 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": container with ID starting with 40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04 not found: ID does not exist" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144605 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"} err="failed to get container status \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": rpc error: code = NotFound desc = could not find container \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": container with ID starting with 40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04 not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144619 4883 scope.go:117] "RemoveContainer" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"
Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.144895 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": container with ID starting with f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be not found: ID does not exist" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144915 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"} err="failed to get container status \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": rpc error: code = NotFound desc = could not find container \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": container with ID starting with f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.144930 4883 scope.go:117] "RemoveContainer" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"
Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.145166 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": container with ID starting with 54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642 not found: ID does not exist" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145189 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"} err="failed to get container status \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": rpc error: code = NotFound desc = could not find container \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": container with ID starting with 54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642 not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145205 4883 scope.go:117] "RemoveContainer" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145671 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"} err="failed to get container status \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": rpc error: code = NotFound desc = could not find container \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": container with ID starting with d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145694 4883 scope.go:117] "RemoveContainer" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145948 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"} err="failed to get container status \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": rpc error: code = NotFound desc = could not find container \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": container with ID starting with 40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04 not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.145969 4883 scope.go:117] "RemoveContainer" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146190 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"} err="failed to get container status \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": rpc error: code = NotFound desc = could not find container \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": container with ID starting with f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146209 4883 scope.go:117] "RemoveContainer" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146408 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"} err="failed to get container status \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": rpc error: code = NotFound desc = could not find container \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": container with ID starting with 54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642 not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146629 4883 scope.go:117] "RemoveContainer" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146883 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"} err="failed to get container status \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": rpc error: code = NotFound desc = could not find container \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": container with ID starting with d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.146902 4883 scope.go:117] "RemoveContainer" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147098 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"} err="failed to get container status \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": rpc error: code = NotFound desc = could not find container \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": container with ID starting with 40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04 not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147116 4883 scope.go:117] "RemoveContainer" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147344 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"} err="failed to get container status \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": rpc error: code = NotFound desc = could not find container \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": container with ID starting with f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147362 4883 scope.go:117] "RemoveContainer" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147702 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"} err="failed to get container status \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": rpc error: code = NotFound desc = could not find container \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": container with ID starting with 54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642 not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.147750 4883 scope.go:117] "RemoveContainer" containerID="d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148022 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f"} err="failed to get container status \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": rpc error: code = NotFound desc = could not find container \"d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f\": container with ID starting with d631b40f6975eec894f6780b96615aa9ca0a7d237599f47f906b0037c84cc97f not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148046 4883 scope.go:117] "RemoveContainer" containerID="40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148250 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04"} err="failed to get container status \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": rpc error: code = NotFound desc = could not find container \"40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04\": container with ID starting with 40678307bf38603545d359aa46168ae76fc479fff4467d0d480faf38e20fde04 not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148270 4883 scope.go:117] "RemoveContainer" containerID="f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148445 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be"} err="failed to get container status \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": rpc error: code = NotFound desc = could not find container \"f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be\": container with ID starting with f694255592e32a25abdfbb51149c478f3bbee3799cc5d1b0767e9b182d2123be not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148464 4883 scope.go:117] "RemoveContainer" containerID="54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.148662 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642"} err="failed to get container status \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": rpc error: code = NotFound desc = could not find container \"54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642\": container with ID starting with 54dc4edaff7b0c9b8aa27c7af1746102123308100688f90e1ffcc9461e687642 not found: ID does not exist"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.326982 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.333697 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.347448 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7749c44969-gf7ng"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.352777 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.353316 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="proxy-httpd"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353336 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="proxy-httpd"
Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.353387 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-central-agent"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353394 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-central-agent"
Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.353410 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-notification-agent"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353417 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-notification-agent"
Mar 10 09:25:32 crc kubenswrapper[4883]: E0310 09:25:32.353458 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="sg-core"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353465 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="sg-core"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353704 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-notification-agent"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353746 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="proxy-httpd"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353759 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="sg-core"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.353769 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" containerName="ceilometer-central-agent"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.355856 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.357798 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.358117 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.358383 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.373190 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.411067 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"]
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.411327 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="dnsmasq-dns" containerID="cri-o://8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867" gracePeriod=10
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.492919 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.492977 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.493019 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-config-data\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.493646 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.493811 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-scripts\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.493877 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.493930 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.494123 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqldd\" (UniqueName: \"kubernetes.io/projected/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-kube-api-access-kqldd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.596616 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597128 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597176 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-config-data\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597296 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597328 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-scripts\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597358 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597390 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597437 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqldd\" (UniqueName: \"kubernetes.io/projected/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-kube-api-access-kqldd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.597070 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-log-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.598879 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-run-httpd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.604256 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-scripts\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.606962 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.608764 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-config-data\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.609331 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0"
Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.618728
4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqldd\" (UniqueName: \"kubernetes.io/projected/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-kube-api-access-kqldd\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.628823 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0819f125-35db-4a0e-8fff-c1d3d3a27ae7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0819f125-35db-4a0e-8fff-c1d3d3a27ae7\") " pod="openstack/ceilometer-0" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.670664 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 09:25:32 crc kubenswrapper[4883]: I0310 09:25:32.895902 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.010825 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc\") pod \"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.010961 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0\") pod \"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.011025 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb\") pod 
\"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.011157 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb\") pod \"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.011197 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config\") pod \"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.011739 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrr79\" (UniqueName: \"kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79\") pod \"abc326c8-0db0-4645-b1dc-3871b1b4202c\" (UID: \"abc326c8-0db0-4645-b1dc-3871b1b4202c\") " Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016686 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79" (OuterVolumeSpecName: "kube-api-access-hrr79") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "kube-api-access-hrr79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016741 4883 generic.go:334] "Generic (PLEG): container finished" podID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerID="8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867" exitCode=0 Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016802 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" event={"ID":"abc326c8-0db0-4645-b1dc-3871b1b4202c","Type":"ContainerDied","Data":"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867"} Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016836 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" event={"ID":"abc326c8-0db0-4645-b1dc-3871b1b4202c","Type":"ContainerDied","Data":"48534b3ef2da468c813fcefb6a12bb7f5e9a292a4eb15ad03c6e477d12cc26e8"} Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016846 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-sqzrs" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.016862 4883 scope.go:117] "RemoveContainer" containerID="8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.050157 4883 scope.go:117] "RemoveContainer" containerID="419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.054028 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.060406 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.060902 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config" (OuterVolumeSpecName: "config") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.061668 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.064525 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "abc326c8-0db0-4645-b1dc-3871b1b4202c" (UID: "abc326c8-0db0-4645-b1dc-3871b1b4202c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.078999 4883 scope.go:117] "RemoveContainer" containerID="8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867" Mar 10 09:25:33 crc kubenswrapper[4883]: E0310 09:25:33.079751 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867\": container with ID starting with 8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867 not found: ID does not exist" containerID="8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.079794 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867"} err="failed to get container status \"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867\": rpc error: code = NotFound desc = could not find container \"8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867\": container with ID starting with 8420a220ae62112ec2567ff803f259c40f5eb73eec78966860eaef65d3e75867 not found: ID does not exist" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.079821 4883 scope.go:117] "RemoveContainer" containerID="419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8" Mar 10 09:25:33 crc kubenswrapper[4883]: E0310 09:25:33.080062 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8\": container with ID starting with 419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8 not found: ID does not exist" containerID="419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.080085 
4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8"} err="failed to get container status \"419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8\": rpc error: code = NotFound desc = could not find container \"419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8\": container with ID starting with 419fb63c786b86a99f0994d43f9116eeb7c6ddeeb291d764c26060eac680efd8 not found: ID does not exist" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118855 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118892 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118906 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118917 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrr79\" (UniqueName: \"kubernetes.io/projected/abc326c8-0db0-4645-b1dc-3871b1b4202c-kube-api-access-hrr79\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118931 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.118941 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/abc326c8-0db0-4645-b1dc-3871b1b4202c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.141506 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 09:25:33 crc kubenswrapper[4883]: W0310 09:25:33.143117 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0819f125_35db_4a0e_8fff_c1d3d3a27ae7.slice/crio-3665a6e9a45a41d5d26fb264a72f8e2eb825426ad9a5ff0aa026b9163c9f8251 WatchSource:0}: Error finding container 3665a6e9a45a41d5d26fb264a72f8e2eb825426ad9a5ff0aa026b9163c9f8251: Status 404 returned error can't find the container with id 3665a6e9a45a41d5d26fb264a72f8e2eb825426ad9a5ff0aa026b9163c9f8251 Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.353168 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"] Mar 10 09:25:33 crc kubenswrapper[4883]: I0310 09:25:33.362581 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-sqzrs"] Mar 10 09:25:34 crc kubenswrapper[4883]: I0310 09:25:34.028734 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0819f125-35db-4a0e-8fff-c1d3d3a27ae7","Type":"ContainerStarted","Data":"6b425537c6e483b6f3eaf025c0fafabce4e535c71aad18b280b99b81166e3680"} Mar 10 09:25:34 crc kubenswrapper[4883]: I0310 09:25:34.029141 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0819f125-35db-4a0e-8fff-c1d3d3a27ae7","Type":"ContainerStarted","Data":"3665a6e9a45a41d5d26fb264a72f8e2eb825426ad9a5ff0aa026b9163c9f8251"} Mar 10 09:25:34 crc kubenswrapper[4883]: I0310 09:25:34.094929 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="718b65b8-f5c7-4933-945a-8e5e5dea72a4" path="/var/lib/kubelet/pods/718b65b8-f5c7-4933-945a-8e5e5dea72a4/volumes" 
Mar 10 09:25:34 crc kubenswrapper[4883]: I0310 09:25:34.095735 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" path="/var/lib/kubelet/pods/abc326c8-0db0-4645-b1dc-3871b1b4202c/volumes" Mar 10 09:25:35 crc kubenswrapper[4883]: I0310 09:25:35.045763 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0819f125-35db-4a0e-8fff-c1d3d3a27ae7","Type":"ContainerStarted","Data":"8722d4d9e78b4e3fdd7d4d7e1afff9fe658e01839db036dd1e53cf059aff5e0c"} Mar 10 09:25:36 crc kubenswrapper[4883]: I0310 09:25:36.055863 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0819f125-35db-4a0e-8fff-c1d3d3a27ae7","Type":"ContainerStarted","Data":"1a0d31cb723db77d80365324a3a9f5216de56fece3784158f2f794688c8a4f83"} Mar 10 09:25:39 crc kubenswrapper[4883]: I0310 09:25:39.107929 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0819f125-35db-4a0e-8fff-c1d3d3a27ae7","Type":"ContainerStarted","Data":"3eb92d4db72f0dfd8ed05d7c20385851473d8ece9cf57b03c7a2ca0ddd17c0c7"} Mar 10 09:25:39 crc kubenswrapper[4883]: I0310 09:25:39.108458 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 09:25:39 crc kubenswrapper[4883]: I0310 09:25:39.133884 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.023845289 podStartE2EDuration="7.133859322s" podCreationTimestamp="2026-03-10 09:25:32 +0000 UTC" firstStartedPulling="2026-03-10 09:25:33.145955698 +0000 UTC m=+1319.400853588" lastFinishedPulling="2026-03-10 09:25:38.255969732 +0000 UTC m=+1324.510867621" observedRunningTime="2026-03-10 09:25:39.123833577 +0000 UTC m=+1325.378731466" watchObservedRunningTime="2026-03-10 09:25:39.133859322 +0000 UTC m=+1325.388757211" Mar 10 09:25:39 crc kubenswrapper[4883]: I0310 09:25:39.341164 4883 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 09:25:39 crc kubenswrapper[4883]: I0310 09:25:39.341445 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 09:25:40 crc kubenswrapper[4883]: I0310 09:25:40.356622 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="14ae000e-33d5-4caa-8b61-dd1ab03b9978" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 09:25:40 crc kubenswrapper[4883]: I0310 09:25:40.356666 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="14ae000e-33d5-4caa-8b61-dd1ab03b9978" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.561536 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"] Mar 10 09:25:45 crc kubenswrapper[4883]: E0310 09:25:45.562416 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="dnsmasq-dns" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.562440 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="dnsmasq-dns" Mar 10 09:25:45 crc kubenswrapper[4883]: E0310 09:25:45.562451 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="init" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.562456 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="init" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.562665 4883 
memory_manager.go:354] "RemoveStaleState removing state" podUID="abc326c8-0db0-4645-b1dc-3871b1b4202c" containerName="dnsmasq-dns" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.563815 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.574562 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"] Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.581770 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.581890 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.581963 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjb5f\" (UniqueName: \"kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.683517 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities\") pod 
\"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.683668 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjb5f\" (UniqueName: \"kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.683748 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.683934 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.684140 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content\") pod \"redhat-marketplace-bcdxb\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.705219 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjb5f\" (UniqueName: \"kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f\") pod \"redhat-marketplace-bcdxb\" (UID: 
\"4a7004ef-7b97-475f-8801-f2097208978d\") " pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:45 crc kubenswrapper[4883]: I0310 09:25:45.881659 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:46 crc kubenswrapper[4883]: I0310 09:25:46.288077 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"] Mar 10 09:25:46 crc kubenswrapper[4883]: W0310 09:25:46.294807 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a7004ef_7b97_475f_8801_f2097208978d.slice/crio-03dbd0a754bf7b7da3b2a64b926cb788cb06ea3fe8bc3915b47fa5a8d6035ecc WatchSource:0}: Error finding container 03dbd0a754bf7b7da3b2a64b926cb788cb06ea3fe8bc3915b47fa5a8d6035ecc: Status 404 returned error can't find the container with id 03dbd0a754bf7b7da3b2a64b926cb788cb06ea3fe8bc3915b47fa5a8d6035ecc Mar 10 09:25:47 crc kubenswrapper[4883]: I0310 09:25:47.206145 4883 generic.go:334] "Generic (PLEG): container finished" podID="4a7004ef-7b97-475f-8801-f2097208978d" containerID="83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b" exitCode=0 Mar 10 09:25:47 crc kubenswrapper[4883]: I0310 09:25:47.206599 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerDied","Data":"83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b"} Mar 10 09:25:47 crc kubenswrapper[4883]: I0310 09:25:47.206654 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerStarted","Data":"03dbd0a754bf7b7da3b2a64b926cb788cb06ea3fe8bc3915b47fa5a8d6035ecc"} Mar 10 09:25:47 crc kubenswrapper[4883]: I0310 09:25:47.448690 4883 patch_prober.go:28] interesting 
pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:25:47 crc kubenswrapper[4883]: I0310 09:25:47.448767 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.230192 4883 generic.go:334] "Generic (PLEG): container finished" podID="4a7004ef-7b97-475f-8801-f2097208978d" containerID="d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255" exitCode=0 Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.230319 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerDied","Data":"d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255"} Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.347954 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.348656 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.354039 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 09:25:49 crc kubenswrapper[4883]: I0310 09:25:49.355245 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 09:25:50 crc kubenswrapper[4883]: I0310 09:25:50.245014 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerStarted","Data":"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f"} Mar 10 09:25:50 crc kubenswrapper[4883]: I0310 09:25:50.245391 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 09:25:50 crc kubenswrapper[4883]: I0310 09:25:50.251631 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 09:25:50 crc kubenswrapper[4883]: I0310 09:25:50.270838 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bcdxb" podStartSLOduration=2.78584722 podStartE2EDuration="5.270812196s" podCreationTimestamp="2026-03-10 09:25:45 +0000 UTC" firstStartedPulling="2026-03-10 09:25:47.20958738 +0000 UTC m=+1333.464485270" lastFinishedPulling="2026-03-10 09:25:49.694552356 +0000 UTC m=+1335.949450246" observedRunningTime="2026-03-10 09:25:50.262707364 +0000 UTC m=+1336.517605253" watchObservedRunningTime="2026-03-10 09:25:50.270812196 +0000 UTC m=+1336.525710085" Mar 10 09:25:55 crc kubenswrapper[4883]: I0310 09:25:55.882121 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:55 crc kubenswrapper[4883]: I0310 09:25:55.882701 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:55 crc kubenswrapper[4883]: I0310 09:25:55.926034 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:56 crc kubenswrapper[4883]: I0310 09:25:56.343817 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:56 crc kubenswrapper[4883]: I0310 09:25:56.385671 4883 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"] Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.317255 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bcdxb" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="registry-server" containerID="cri-o://f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f" gracePeriod=2 Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.720282 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.841331 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content\") pod \"4a7004ef-7b97-475f-8801-f2097208978d\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.841731 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjb5f\" (UniqueName: \"kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f\") pod \"4a7004ef-7b97-475f-8801-f2097208978d\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.842024 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities\") pod \"4a7004ef-7b97-475f-8801-f2097208978d\" (UID: \"4a7004ef-7b97-475f-8801-f2097208978d\") " Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.842840 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities" (OuterVolumeSpecName: "utilities") pod 
"4a7004ef-7b97-475f-8801-f2097208978d" (UID: "4a7004ef-7b97-475f-8801-f2097208978d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.859976 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f" (OuterVolumeSpecName: "kube-api-access-pjb5f") pod "4a7004ef-7b97-475f-8801-f2097208978d" (UID: "4a7004ef-7b97-475f-8801-f2097208978d"). InnerVolumeSpecName "kube-api-access-pjb5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.864821 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a7004ef-7b97-475f-8801-f2097208978d" (UID: "4a7004ef-7b97-475f-8801-f2097208978d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.943964 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.944089 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7004ef-7b97-475f-8801-f2097208978d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:58 crc kubenswrapper[4883]: I0310 09:25:58.944167 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjb5f\" (UniqueName: \"kubernetes.io/projected/4a7004ef-7b97-475f-8801-f2097208978d-kube-api-access-pjb5f\") on node \"crc\" DevicePath \"\"" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.327915 4883 generic.go:334] "Generic (PLEG): container finished" podID="4a7004ef-7b97-475f-8801-f2097208978d" containerID="f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f" exitCode=0 Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.327986 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcdxb" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.327983 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerDied","Data":"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f"} Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.328063 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcdxb" event={"ID":"4a7004ef-7b97-475f-8801-f2097208978d","Type":"ContainerDied","Data":"03dbd0a754bf7b7da3b2a64b926cb788cb06ea3fe8bc3915b47fa5a8d6035ecc"} Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.328097 4883 scope.go:117] "RemoveContainer" containerID="f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.353756 4883 scope.go:117] "RemoveContainer" containerID="d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.365773 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"] Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.372832 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcdxb"] Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.375383 4883 scope.go:117] "RemoveContainer" containerID="83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.408534 4883 scope.go:117] "RemoveContainer" containerID="f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f" Mar 10 09:25:59 crc kubenswrapper[4883]: E0310 09:25:59.408915 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f\": container with ID starting with f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f not found: ID does not exist" containerID="f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.408947 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f"} err="failed to get container status \"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f\": rpc error: code = NotFound desc = could not find container \"f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f\": container with ID starting with f176c08a964073c36894ad7b8c8b755d66b9da0dbc28a293606775e1e850792f not found: ID does not exist" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.408971 4883 scope.go:117] "RemoveContainer" containerID="d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255" Mar 10 09:25:59 crc kubenswrapper[4883]: E0310 09:25:59.409213 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255\": container with ID starting with d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255 not found: ID does not exist" containerID="d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.409237 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255"} err="failed to get container status \"d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255\": rpc error: code = NotFound desc = could not find container \"d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255\": container with ID 
starting with d129b06ff22dab127af23f489e6ed059e2ef93ad003ae7fbdc08c0479d715255 not found: ID does not exist" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.409251 4883 scope.go:117] "RemoveContainer" containerID="83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b" Mar 10 09:25:59 crc kubenswrapper[4883]: E0310 09:25:59.410221 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b\": container with ID starting with 83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b not found: ID does not exist" containerID="83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b" Mar 10 09:25:59 crc kubenswrapper[4883]: I0310 09:25:59.410277 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b"} err="failed to get container status \"83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b\": rpc error: code = NotFound desc = could not find container \"83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b\": container with ID starting with 83c44729e6954c330a5d4390ff44509fe33860de09975a6e046148bd49cd200b not found: ID does not exist" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.090579 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7004ef-7b97-475f-8801-f2097208978d" path="/var/lib/kubelet/pods/4a7004ef-7b97-475f-8801-f2097208978d/volumes" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.138126 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552246-r4bxm"] Mar 10 09:26:00 crc kubenswrapper[4883]: E0310 09:26:00.138560 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="extract-content" Mar 10 09:26:00 crc 
kubenswrapper[4883]: I0310 09:26:00.138576 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="extract-content" Mar 10 09:26:00 crc kubenswrapper[4883]: E0310 09:26:00.138597 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="registry-server" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.138603 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="registry-server" Mar 10 09:26:00 crc kubenswrapper[4883]: E0310 09:26:00.138613 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="extract-utilities" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.138619 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="extract-utilities" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.138779 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7004ef-7b97-475f-8801-f2097208978d" containerName="registry-server" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.139369 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.141041 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.141804 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.142287 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.143351 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-r4bxm"] Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.271903 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzbd6\" (UniqueName: \"kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6\") pod \"auto-csr-approver-29552246-r4bxm\" (UID: \"bcfbbeba-ae1f-4e53-ba68-3cc981395803\") " pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.374370 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzbd6\" (UniqueName: \"kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6\") pod \"auto-csr-approver-29552246-r4bxm\" (UID: \"bcfbbeba-ae1f-4e53-ba68-3cc981395803\") " pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.393773 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzbd6\" (UniqueName: \"kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6\") pod \"auto-csr-approver-29552246-r4bxm\" (UID: \"bcfbbeba-ae1f-4e53-ba68-3cc981395803\") " 
pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.453228 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:00 crc kubenswrapper[4883]: I0310 09:26:00.846232 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-r4bxm"] Mar 10 09:26:01 crc kubenswrapper[4883]: I0310 09:26:01.352330 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" event={"ID":"bcfbbeba-ae1f-4e53-ba68-3cc981395803","Type":"ContainerStarted","Data":"25dd0b5e43e6c881252f792f5c474af9b797127e8b5b57afb6f82eacfccdfa31"} Mar 10 09:26:02 crc kubenswrapper[4883]: I0310 09:26:02.363718 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" event={"ID":"bcfbbeba-ae1f-4e53-ba68-3cc981395803","Type":"ContainerStarted","Data":"f847a3856c40dae8ee7ad4ac93ecaf18ffea7cc99907753e085972a5a353218f"} Mar 10 09:26:02 crc kubenswrapper[4883]: I0310 09:26:02.384449 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" podStartSLOduration=1.308223498 podStartE2EDuration="2.384421491s" podCreationTimestamp="2026-03-10 09:26:00 +0000 UTC" firstStartedPulling="2026-03-10 09:26:00.850493987 +0000 UTC m=+1347.105391876" lastFinishedPulling="2026-03-10 09:26:01.92669198 +0000 UTC m=+1348.181589869" observedRunningTime="2026-03-10 09:26:02.376770233 +0000 UTC m=+1348.631668123" watchObservedRunningTime="2026-03-10 09:26:02.384421491 +0000 UTC m=+1348.639319380" Mar 10 09:26:02 crc kubenswrapper[4883]: I0310 09:26:02.683695 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 09:26:03 crc kubenswrapper[4883]: I0310 09:26:03.377036 4883 generic.go:334] "Generic (PLEG): container finished" 
podID="bcfbbeba-ae1f-4e53-ba68-3cc981395803" containerID="f847a3856c40dae8ee7ad4ac93ecaf18ffea7cc99907753e085972a5a353218f" exitCode=0 Mar 10 09:26:03 crc kubenswrapper[4883]: I0310 09:26:03.377093 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" event={"ID":"bcfbbeba-ae1f-4e53-ba68-3cc981395803","Type":"ContainerDied","Data":"f847a3856c40dae8ee7ad4ac93ecaf18ffea7cc99907753e085972a5a353218f"} Mar 10 09:26:04 crc kubenswrapper[4883]: I0310 09:26:04.698800 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:04 crc kubenswrapper[4883]: I0310 09:26:04.867827 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzbd6\" (UniqueName: \"kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6\") pod \"bcfbbeba-ae1f-4e53-ba68-3cc981395803\" (UID: \"bcfbbeba-ae1f-4e53-ba68-3cc981395803\") " Mar 10 09:26:04 crc kubenswrapper[4883]: I0310 09:26:04.878577 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6" (OuterVolumeSpecName: "kube-api-access-hzbd6") pod "bcfbbeba-ae1f-4e53-ba68-3cc981395803" (UID: "bcfbbeba-ae1f-4e53-ba68-3cc981395803"). InnerVolumeSpecName "kube-api-access-hzbd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:04 crc kubenswrapper[4883]: I0310 09:26:04.971564 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzbd6\" (UniqueName: \"kubernetes.io/projected/bcfbbeba-ae1f-4e53-ba68-3cc981395803-kube-api-access-hzbd6\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:05 crc kubenswrapper[4883]: I0310 09:26:05.396041 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" event={"ID":"bcfbbeba-ae1f-4e53-ba68-3cc981395803","Type":"ContainerDied","Data":"25dd0b5e43e6c881252f792f5c474af9b797127e8b5b57afb6f82eacfccdfa31"} Mar 10 09:26:05 crc kubenswrapper[4883]: I0310 09:26:05.396523 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25dd0b5e43e6c881252f792f5c474af9b797127e8b5b57afb6f82eacfccdfa31" Mar 10 09:26:05 crc kubenswrapper[4883]: I0310 09:26:05.396162 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-r4bxm" Mar 10 09:26:05 crc kubenswrapper[4883]: I0310 09:26:05.436617 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-29nz4"] Mar 10 09:26:05 crc kubenswrapper[4883]: I0310 09:26:05.442207 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-29nz4"] Mar 10 09:26:06 crc kubenswrapper[4883]: I0310 09:26:06.090859 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed80b911-07e4-45b8-9324-dfdf65e5a508" path="/var/lib/kubelet/pods/ed80b911-07e4-45b8-9324-dfdf65e5a508/volumes" Mar 10 09:26:10 crc kubenswrapper[4883]: I0310 09:26:10.697268 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 09:26:12 crc kubenswrapper[4883]: I0310 09:26:12.240289 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 
09:26:15 crc kubenswrapper[4883]: I0310 09:26:15.227338 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="rabbitmq" containerID="cri-o://6b536f8286ba8f67ed7799e690ba32685de9ac7a70392f9e12efea3132a20a4e" gracePeriod=604796 Mar 10 09:26:16 crc kubenswrapper[4883]: I0310 09:26:16.087631 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="rabbitmq" containerID="cri-o://554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5" gracePeriod=604797 Mar 10 09:26:17 crc kubenswrapper[4883]: I0310 09:26:17.449629 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:26:17 crc kubenswrapper[4883]: I0310 09:26:17.450023 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:26:19 crc kubenswrapper[4883]: I0310 09:26:19.697591 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Mar 10 09:26:19 crc kubenswrapper[4883]: I0310 09:26:19.973547 4883 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.544124 4883 generic.go:334] "Generic (PLEG): container finished" podID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerID="6b536f8286ba8f67ed7799e690ba32685de9ac7a70392f9e12efea3132a20a4e" exitCode=0 Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.544228 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerDied","Data":"6b536f8286ba8f67ed7799e690ba32685de9ac7a70392f9e12efea3132a20a4e"} Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.803095 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817407 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817580 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817687 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817793 4883 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nh4vm\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817818 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817890 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817907 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817953 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.817986 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: 
\"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.818004 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.818034 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd\") pod \"cdb6ba72-d1c8-4022-9029-2e18784e1139\" (UID: \"cdb6ba72-d1c8-4022-9029-2e18784e1139\") " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.824226 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.826039 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.826424 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.834626 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm" (OuterVolumeSpecName: "kube-api-access-nh4vm") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "kube-api-access-nh4vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.843624 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info" (OuterVolumeSpecName: "pod-info") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.865672 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.865820 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.872683 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.919678 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.919809 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh4vm\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-kube-api-access-nh4vm\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.919869 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.919919 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.919978 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.920058 4883 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cdb6ba72-d1c8-4022-9029-2e18784e1139-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.920116 4883 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.920170 4883 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cdb6ba72-d1c8-4022-9029-2e18784e1139-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.926927 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data" (OuterVolumeSpecName: "config-data") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.960879 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 10 09:26:21 crc kubenswrapper[4883]: I0310 09:26:21.980042 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf" (OuterVolumeSpecName: "server-conf") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.012715 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cdb6ba72-d1c8-4022-9029-2e18784e1139" (UID: "cdb6ba72-d1c8-4022-9029-2e18784e1139"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.022031 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.022060 4883 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.022073 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cdb6ba72-d1c8-4022-9029-2e18784e1139-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.022083 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cdb6ba72-d1c8-4022-9029-2e18784e1139-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.226386 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"] Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.226820 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="setup-container" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.226836 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="setup-container" Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.226856 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="rabbitmq" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.226863 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="rabbitmq" Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.226882 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfbbeba-ae1f-4e53-ba68-3cc981395803" containerName="oc" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.226888 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfbbeba-ae1f-4e53-ba68-3cc981395803" containerName="oc" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.227083 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" containerName="rabbitmq" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.227102 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcfbbeba-ae1f-4e53-ba68-3cc981395803" containerName="oc" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.228068 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.236229 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.247297 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvtdq\" (UniqueName: \"kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328688 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328720 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328809 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328828 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328901 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.328933 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.430365 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.430417 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.430480 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431347 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431392 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431395 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431556 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431574 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvtdq\" (UniqueName: \"kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431629 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.431688 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb\") pod 
\"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.432016 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.432338 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.432413 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.455066 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvtdq\" (UniqueName: \"kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq\") pod \"dnsmasq-dns-bfb45b47-4wrl2\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.545823 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.553080 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cdb6ba72-d1c8-4022-9029-2e18784e1139","Type":"ContainerDied","Data":"531195218a3dcad57b9a11db1f5738cf463317c7f3a59428a1b0f404415ec848"} Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.553134 4883 scope.go:117] "RemoveContainer" containerID="6b536f8286ba8f67ed7799e690ba32685de9ac7a70392f9e12efea3132a20a4e" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.553238 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.555396 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.556910 4883 generic.go:334] "Generic (PLEG): container finished" podID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerID="554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5" exitCode=0 Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.556944 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.556958 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerDied","Data":"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5"} Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.557017 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07","Type":"ContainerDied","Data":"010dc7b31b375a13abbbdc15bf6ef1b807f020341c6f04245119794152aa7af6"} Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.612319 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.619520 4883 scope.go:117] "RemoveContainer" containerID="cd676384987aa2244a851651d9f3c8c0df5750259fc99e409d5e9d090a18495f" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.621018 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.626850 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.627285 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="rabbitmq" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.627299 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="rabbitmq" Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.627313 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="setup-container" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.627319 4883 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="setup-container" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.628027 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" containerName="rabbitmq" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.628963 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.635886 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.636224 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.636358 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.636411 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.636979 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.637066 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x4lhh" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.637173 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.658433 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.694182 4883 scope.go:117] "RemoveContainer" containerID="554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5" Mar 10 09:26:22 crc 
kubenswrapper[4883]: I0310 09:26:22.714723 4883 scope.go:117] "RemoveContainer" containerID="8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738122 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738161 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738288 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738308 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738357 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 
09:26:22.738403 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnprs\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738505 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738526 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738546 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738600 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738657 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins\") pod \"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\" (UID: 
\"6aa2bcd6-6a54-472f-bd1c-276e6f8caa07\") " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.738984 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739076 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739097 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739131 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ssc\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-kube-api-access-v8ssc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739151 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1be05788-71cf-486a-8142-e317e959bfe9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 
09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739197 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739230 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-config-data\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739246 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739399 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739415 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1be05788-71cf-486a-8142-e317e959bfe9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.739429 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.740072 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.740678 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.741352 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.743701 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info" (OuterVolumeSpecName: "pod-info") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). 
InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.747984 4883 scope.go:117] "RemoveContainer" containerID="554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.749659 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.752321 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.752416 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs" (OuterVolumeSpecName: "kube-api-access-dnprs") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "kube-api-access-dnprs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.752631 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5\": container with ID starting with 554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5 not found: ID does not exist" containerID="554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.752800 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5"} err="failed to get container status \"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5\": rpc error: code = NotFound desc = could not find container \"554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5\": container with ID starting with 554ff1ad814d4eeb6b0cf32ac62f5a189d80bc82978a99a55f6cb709ff86dab5 not found: ID does not exist" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.753110 4883 scope.go:117] "RemoveContainer" containerID="8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5" Mar 10 09:26:22 crc kubenswrapper[4883]: E0310 09:26:22.756573 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5\": container with ID starting with 8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5 not found: ID does not exist" containerID="8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.756624 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5"} 
err="failed to get container status \"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5\": rpc error: code = NotFound desc = could not find container \"8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5\": container with ID starting with 8c744b4881c0601635f8b7a8ebe5926a65ac017536c2eca50af7842cf36e30e5 not found: ID does not exist" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.760698 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.783976 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf" (OuterVolumeSpecName: "server-conf") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.786739 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data" (OuterVolumeSpecName: "config-data") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.826371 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" (UID: "6aa2bcd6-6a54-472f-bd1c-276e6f8caa07"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.841820 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.841899 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.841920 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.841951 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ssc\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-kube-api-access-v8ssc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc 
kubenswrapper[4883]: I0310 09:26:22.841968 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1be05788-71cf-486a-8142-e317e959bfe9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.841988 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842011 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-config-data\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842027 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842096 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842118 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/1be05788-71cf-486a-8142-e317e959bfe9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842134 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842183 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnprs\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-kube-api-access-dnprs\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842195 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842215 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842338 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842393 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-config-data\") on node \"crc\" DevicePath \"\"" Mar 
10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842465 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.842775 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843066 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-config-data\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843108 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843228 4883 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843249 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc 
kubenswrapper[4883]: I0310 09:26:22.843262 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843274 4883 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843287 4883 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843296 4883 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843305 4883 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.843746 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1be05788-71cf-486a-8142-e317e959bfe9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.844395 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.844835 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1be05788-71cf-486a-8142-e317e959bfe9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.846439 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1be05788-71cf-486a-8142-e317e959bfe9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.846686 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.856565 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ssc\" (UniqueName: \"kubernetes.io/projected/1be05788-71cf-486a-8142-e317e959bfe9-kube-api-access-v8ssc\") pod \"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.858850 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.867680 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"rabbitmq-server-0\" (UID: \"1be05788-71cf-486a-8142-e317e959bfe9\") " pod="openstack/rabbitmq-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.899459 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.906961 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.926281 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.929142 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.933376 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-cjf6k" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.933677 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.933829 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.933967 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.934376 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.934428 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.934686 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 09:26:22 crc 
kubenswrapper[4883]: I0310 09:26:22.939927 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.945696 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:22 crc kubenswrapper[4883]: I0310 09:26:22.992334 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.045578 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"] Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.047463 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.047769 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.047806 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.047889 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.047908 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/170c41ad-d10f-4567-97ec-2b90d149951b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048038 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048080 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048130 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/170c41ad-d10f-4567-97ec-2b90d149951b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048326 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048377 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.048400 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkflt\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-kube-api-access-dkflt\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150289 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150546 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150574 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150618 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150635 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/170c41ad-d10f-4567-97ec-2b90d149951b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150680 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150702 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150731 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/170c41ad-d10f-4567-97ec-2b90d149951b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150779 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150799 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.150816 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkflt\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-kube-api-access-dkflt\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.151261 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.151936 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"170c41ad-d10f-4567-97ec-2b90d149951b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.152671 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.154160 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.154458 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.154841 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.155031 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/170c41ad-d10f-4567-97ec-2b90d149951b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 
crc kubenswrapper[4883]: I0310 09:26:23.156753 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.159064 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/170c41ad-d10f-4567-97ec-2b90d149951b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.160259 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/170c41ad-d10f-4567-97ec-2b90d149951b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.164774 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkflt\" (UniqueName: \"kubernetes.io/projected/170c41ad-d10f-4567-97ec-2b90d149951b-kube-api-access-dkflt\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.187389 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"170c41ad-d10f-4567-97ec-2b90d149951b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.253948 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.390450 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.573841 4883 generic.go:334] "Generic (PLEG): container finished" podID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerID="1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed" exitCode=0 Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.573977 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" event={"ID":"c97c631a-70a2-4d82-87de-84f1d8eecc19","Type":"ContainerDied","Data":"1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed"} Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.574070 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" event={"ID":"c97c631a-70a2-4d82-87de-84f1d8eecc19","Type":"ContainerStarted","Data":"4031676c69275321eb3f6b90e5c6639caa836d24393c80f802273273e9c5edd8"} Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.578987 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1be05788-71cf-486a-8142-e317e959bfe9","Type":"ContainerStarted","Data":"cb7c2c99cd3ab9abab6d15ca157de5d75a91b7b7d0f2bf51780a524cc1b59cca"} Mar 10 09:26:23 crc kubenswrapper[4883]: I0310 09:26:23.662946 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 09:26:23 crc kubenswrapper[4883]: W0310 09:26:23.669451 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod170c41ad_d10f_4567_97ec_2b90d149951b.slice/crio-8c88563f9f486927cbf6d91d4fc5131e21074808b77904a07bd82caddf9954cc WatchSource:0}: Error finding container 8c88563f9f486927cbf6d91d4fc5131e21074808b77904a07bd82caddf9954cc: Status 404 returned 
error can't find the container with id 8c88563f9f486927cbf6d91d4fc5131e21074808b77904a07bd82caddf9954cc Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.088448 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aa2bcd6-6a54-472f-bd1c-276e6f8caa07" path="/var/lib/kubelet/pods/6aa2bcd6-6a54-472f-bd1c-276e6f8caa07/volumes" Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.089643 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb6ba72-d1c8-4022-9029-2e18784e1139" path="/var/lib/kubelet/pods/cdb6ba72-d1c8-4022-9029-2e18784e1139/volumes" Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.590985 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"170c41ad-d10f-4567-97ec-2b90d149951b","Type":"ContainerStarted","Data":"8c88563f9f486927cbf6d91d4fc5131e21074808b77904a07bd82caddf9954cc"} Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.593769 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" event={"ID":"c97c631a-70a2-4d82-87de-84f1d8eecc19","Type":"ContainerStarted","Data":"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b"} Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.594028 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:24 crc kubenswrapper[4883]: I0310 09:26:24.613372 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" podStartSLOduration=2.613359538 podStartE2EDuration="2.613359538s" podCreationTimestamp="2026-03-10 09:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:26:24.61029995 +0000 UTC m=+1370.865197839" watchObservedRunningTime="2026-03-10 09:26:24.613359538 +0000 UTC m=+1370.868257428" Mar 10 09:26:25 crc 
kubenswrapper[4883]: I0310 09:26:25.602628 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1be05788-71cf-486a-8142-e317e959bfe9","Type":"ContainerStarted","Data":"e068ac8b65990b981161335beb6d85601d5b7c6701979113d5246e6885684704"} Mar 10 09:26:25 crc kubenswrapper[4883]: I0310 09:26:25.604631 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"170c41ad-d10f-4567-97ec-2b90d149951b","Type":"ContainerStarted","Data":"40718267ba86811b3e6ea0c37d5a13d023e91c52a4ea68253cd8019fec0ff03c"} Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.556732 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.620510 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.620962 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="dnsmasq-dns" containerID="cri-o://b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31" gracePeriod=10 Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.709269 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-rx8gc"] Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.711642 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.724249 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-rx8gc"] Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.856321 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.856370 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.856396 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-nb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.856427 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.856959 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-config\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.857279 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.857337 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgldt\" (UniqueName: \"kubernetes.io/projected/da34e0af-a084-40fb-93ea-471923c49051-kube-api-access-pgldt\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.959945 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960013 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960040 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-nb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960069 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960138 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-config\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960189 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960223 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgldt\" (UniqueName: \"kubernetes.io/projected/da34e0af-a084-40fb-93ea-471923c49051-kube-api-access-pgldt\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960862 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.960974 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-nb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.961369 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.961537 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-config\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.961822 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.962083 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/da34e0af-a084-40fb-93ea-471923c49051-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:32 crc kubenswrapper[4883]: I0310 09:26:32.981307 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgldt\" (UniqueName: \"kubernetes.io/projected/da34e0af-a084-40fb-93ea-471923c49051-kube-api-access-pgldt\") pod \"dnsmasq-dns-79fcc958f9-rx8gc\" (UID: \"da34e0af-a084-40fb-93ea-471923c49051\") " pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.060591 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.061265 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165245 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb\") pod \"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165291 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0\") pod \"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165467 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfsjg\" (UniqueName: \"kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg\") pod 
\"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165519 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc\") pod \"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165617 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config\") pod \"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.165707 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb\") pod \"3612d60a-476b-48fa-9163-03c2886a64b2\" (UID: \"3612d60a-476b-48fa-9163-03c2886a64b2\") " Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.173857 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg" (OuterVolumeSpecName: "kube-api-access-xfsjg") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "kube-api-access-xfsjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.204356 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.216016 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config" (OuterVolumeSpecName: "config") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.216126 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.216804 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.218194 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3612d60a-476b-48fa-9163-03c2886a64b2" (UID: "3612d60a-476b-48fa-9163-03c2886a64b2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271665 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271702 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271716 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271732 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271744 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3612d60a-476b-48fa-9163-03c2886a64b2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.271756 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfsjg\" (UniqueName: \"kubernetes.io/projected/3612d60a-476b-48fa-9163-03c2886a64b2-kube-api-access-xfsjg\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.481037 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-rx8gc"] Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.676872 4883 generic.go:334] "Generic (PLEG): container finished" podID="da34e0af-a084-40fb-93ea-471923c49051" 
containerID="febc316b4590ba1f2958d9ee63f57bb0d076198c0d409531bb6e58901795ea3f" exitCode=0 Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.676990 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" event={"ID":"da34e0af-a084-40fb-93ea-471923c49051","Type":"ContainerDied","Data":"febc316b4590ba1f2958d9ee63f57bb0d076198c0d409531bb6e58901795ea3f"} Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.677090 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" event={"ID":"da34e0af-a084-40fb-93ea-471923c49051","Type":"ContainerStarted","Data":"bf6932deb94311a9f7a44cfb9b41801e5ea7f1f3e46d931fa9a611d58802bf48"} Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.679951 4883 generic.go:334] "Generic (PLEG): container finished" podID="3612d60a-476b-48fa-9163-03c2886a64b2" containerID="b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31" exitCode=0 Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.679994 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" event={"ID":"3612d60a-476b-48fa-9163-03c2886a64b2","Type":"ContainerDied","Data":"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31"} Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.680029 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" event={"ID":"3612d60a-476b-48fa-9163-03c2886a64b2","Type":"ContainerDied","Data":"bc8f46ec7a59322161bc14068b60976298d60f1cfa73a1b7887e26a2e987b797"} Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.680041 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-gf7ng" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.680076 4883 scope.go:117] "RemoveContainer" containerID="b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.735179 4883 scope.go:117] "RemoveContainer" containerID="a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.740395 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.747686 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-gf7ng"] Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.838521 4883 scope.go:117] "RemoveContainer" containerID="b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31" Mar 10 09:26:33 crc kubenswrapper[4883]: E0310 09:26:33.839257 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31\": container with ID starting with b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31 not found: ID does not exist" containerID="b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.839304 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31"} err="failed to get container status \"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31\": rpc error: code = NotFound desc = could not find container \"b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31\": container with ID starting with b07be35b58caf2b7d04ed948d22afd0531896eaaae827c4548dab089afbafb31 not found: ID does not exist" Mar 10 
09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.839334 4883 scope.go:117] "RemoveContainer" containerID="a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae" Mar 10 09:26:33 crc kubenswrapper[4883]: E0310 09:26:33.840588 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae\": container with ID starting with a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae not found: ID does not exist" containerID="a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae" Mar 10 09:26:33 crc kubenswrapper[4883]: I0310 09:26:33.840625 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae"} err="failed to get container status \"a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae\": rpc error: code = NotFound desc = could not find container \"a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae\": container with ID starting with a49783b42deb200ba5cd38eef272dd9b4ef308605215dbd9ef86c6fe90cb50ae not found: ID does not exist" Mar 10 09:26:34 crc kubenswrapper[4883]: I0310 09:26:34.098898 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" path="/var/lib/kubelet/pods/3612d60a-476b-48fa-9163-03c2886a64b2/volumes" Mar 10 09:26:34 crc kubenswrapper[4883]: I0310 09:26:34.688785 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" event={"ID":"da34e0af-a084-40fb-93ea-471923c49051","Type":"ContainerStarted","Data":"89f8f38f554753f54883adeb6ee9e04e6496380ec4f61aab451d400118355efd"} Mar 10 09:26:34 crc kubenswrapper[4883]: I0310 09:26:34.688931 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:34 crc 
kubenswrapper[4883]: I0310 09:26:34.710583 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" podStartSLOduration=2.710566527 podStartE2EDuration="2.710566527s" podCreationTimestamp="2026-03-10 09:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:26:34.703419851 +0000 UTC m=+1380.958317741" watchObservedRunningTime="2026-03-10 09:26:34.710566527 +0000 UTC m=+1380.965464416" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.062731 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79fcc958f9-rx8gc" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.124071 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"] Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.124529 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="dnsmasq-dns" containerID="cri-o://ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b" gracePeriod=10 Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.532570 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.686571 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.687100 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.687299 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.687772 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.687908 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvtdq\" (UniqueName: \"kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.687991 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.688107 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config\") pod \"c97c631a-70a2-4d82-87de-84f1d8eecc19\" (UID: \"c97c631a-70a2-4d82-87de-84f1d8eecc19\") " Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.694672 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq" (OuterVolumeSpecName: "kube-api-access-dvtdq") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "kube-api-access-dvtdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.730349 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.731735 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.732663 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.735574 4883 generic.go:334] "Generic (PLEG): container finished" podID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerID="ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b" exitCode=0 Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.735628 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" event={"ID":"c97c631a-70a2-4d82-87de-84f1d8eecc19","Type":"ContainerDied","Data":"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b"} Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.735684 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" event={"ID":"c97c631a-70a2-4d82-87de-84f1d8eecc19","Type":"ContainerDied","Data":"4031676c69275321eb3f6b90e5c6639caa836d24393c80f802273273e9c5edd8"} Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.735704 4883 scope.go:117] "RemoveContainer" containerID="ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.735716 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-4wrl2" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.736646 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config" (OuterVolumeSpecName: "config") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.737695 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.738382 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c97c631a-70a2-4d82-87de-84f1d8eecc19" (UID: "c97c631a-70a2-4d82-87de-84f1d8eecc19"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792017 4883 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792049 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvtdq\" (UniqueName: \"kubernetes.io/projected/c97c631a-70a2-4d82-87de-84f1d8eecc19-kube-api-access-dvtdq\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792061 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792075 4883 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792085 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792094 4883 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.792103 4883 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c97c631a-70a2-4d82-87de-84f1d8eecc19-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.795811 
4883 scope.go:117] "RemoveContainer" containerID="1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.820355 4883 scope.go:117] "RemoveContainer" containerID="ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b" Mar 10 09:26:38 crc kubenswrapper[4883]: E0310 09:26:38.820739 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b\": container with ID starting with ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b not found: ID does not exist" containerID="ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.820771 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b"} err="failed to get container status \"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b\": rpc error: code = NotFound desc = could not find container \"ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b\": container with ID starting with ad816538cf2f7e76285001405060d470dbed2dd35909f388dc1ed603d0f53f7b not found: ID does not exist" Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.820796 4883 scope.go:117] "RemoveContainer" containerID="1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed" Mar 10 09:26:38 crc kubenswrapper[4883]: E0310 09:26:38.821150 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed\": container with ID starting with 1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed not found: ID does not exist" containerID="1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed" 
Mar 10 09:26:38 crc kubenswrapper[4883]: I0310 09:26:38.821189 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed"} err="failed to get container status \"1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed\": rpc error: code = NotFound desc = could not find container \"1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed\": container with ID starting with 1705048307aad74e11f96af507b2e7af954e4ccc935cb70ca6dd29e11e4d89ed not found: ID does not exist" Mar 10 09:26:39 crc kubenswrapper[4883]: I0310 09:26:39.067827 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"] Mar 10 09:26:39 crc kubenswrapper[4883]: I0310 09:26:39.077216 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-4wrl2"] Mar 10 09:26:40 crc kubenswrapper[4883]: I0310 09:26:40.092165 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" path="/var/lib/kubelet/pods/c97c631a-70a2-4d82-87de-84f1d8eecc19/volumes" Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.449395 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.450049 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.450106 4883 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.450715 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.450773 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225" gracePeriod=600 Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.823209 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225" exitCode=0 Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.823282 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225"} Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.823518 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"} Mar 10 09:26:47 crc kubenswrapper[4883]: I0310 09:26:47.823542 4883 scope.go:117] "RemoveContainer" 
containerID="baa5421b80377ba5964153032e8442f67866ecaee882a6a6d6389530ab32d1e3" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.748087 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7"] Mar 10 09:26:51 crc kubenswrapper[4883]: E0310 09:26:51.755879 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="dnsmasq-dns" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.755895 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="dnsmasq-dns" Mar 10 09:26:51 crc kubenswrapper[4883]: E0310 09:26:51.755911 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="init" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.755916 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="init" Mar 10 09:26:51 crc kubenswrapper[4883]: E0310 09:26:51.755938 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="dnsmasq-dns" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.755944 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="dnsmasq-dns" Mar 10 09:26:51 crc kubenswrapper[4883]: E0310 09:26:51.755959 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="init" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.755965 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="init" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.756111 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="3612d60a-476b-48fa-9163-03c2886a64b2" containerName="dnsmasq-dns" Mar 10 
09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.756128 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c97c631a-70a2-4d82-87de-84f1d8eecc19" containerName="dnsmasq-dns" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.756616 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7"] Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.756707 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.758843 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.759063 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.760813 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.760905 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.864156 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdf25\" (UniqueName: \"kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.864376 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.864639 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.864691 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.966547 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.966877 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: 
\"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.967046 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdf25\" (UniqueName: \"kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.967267 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.973460 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.973964 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.974260 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:51 crc kubenswrapper[4883]: I0310 09:26:51.981098 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdf25\" (UniqueName: \"kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:52 crc kubenswrapper[4883]: I0310 09:26:52.070830 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:26:52 crc kubenswrapper[4883]: I0310 09:26:52.553779 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7"] Mar 10 09:26:52 crc kubenswrapper[4883]: I0310 09:26:52.912760 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" event={"ID":"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9","Type":"ContainerStarted","Data":"fca1a18a7932a35b0c2408ef7f3f6f3577a7605fbf02ca9e7d5424a6dc0043be"} Mar 10 09:26:56 crc kubenswrapper[4883]: I0310 09:26:56.954067 4883 generic.go:334] "Generic (PLEG): container finished" podID="1be05788-71cf-486a-8142-e317e959bfe9" containerID="e068ac8b65990b981161335beb6d85601d5b7c6701979113d5246e6885684704" exitCode=0 Mar 10 09:26:56 crc kubenswrapper[4883]: I0310 09:26:56.954170 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"1be05788-71cf-486a-8142-e317e959bfe9","Type":"ContainerDied","Data":"e068ac8b65990b981161335beb6d85601d5b7c6701979113d5246e6885684704"} Mar 10 09:26:56 crc kubenswrapper[4883]: I0310 09:26:56.960318 4883 generic.go:334] "Generic (PLEG): container finished" podID="170c41ad-d10f-4567-97ec-2b90d149951b" containerID="40718267ba86811b3e6ea0c37d5a13d023e91c52a4ea68253cd8019fec0ff03c" exitCode=0 Mar 10 09:26:56 crc kubenswrapper[4883]: I0310 09:26:56.960405 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"170c41ad-d10f-4567-97ec-2b90d149951b","Type":"ContainerDied","Data":"40718267ba86811b3e6ea0c37d5a13d023e91c52a4ea68253cd8019fec0ff03c"} Mar 10 09:26:57 crc kubenswrapper[4883]: I0310 09:26:57.975355 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"170c41ad-d10f-4567-97ec-2b90d149951b","Type":"ContainerStarted","Data":"30c55dba4871c6db10a361237096387128c315da62812b4241a38b5f16d1fdab"} Mar 10 09:26:57 crc kubenswrapper[4883]: I0310 09:26:57.976774 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:26:57 crc kubenswrapper[4883]: I0310 09:26:57.980626 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1be05788-71cf-486a-8142-e317e959bfe9","Type":"ContainerStarted","Data":"d20fb133c019c39c2aef7a4f1f09bf5b11372d75dcd388ad04aab69eaa5a8b62"} Mar 10 09:26:57 crc kubenswrapper[4883]: I0310 09:26:57.980863 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 09:26:58 crc kubenswrapper[4883]: I0310 09:26:58.007526 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.007508786 podStartE2EDuration="36.007508786s" podCreationTimestamp="2026-03-10 09:26:22 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:26:57.99510652 +0000 UTC m=+1404.250004408" watchObservedRunningTime="2026-03-10 09:26:58.007508786 +0000 UTC m=+1404.262406674" Mar 10 09:26:58 crc kubenswrapper[4883]: I0310 09:26:58.021093 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.02107607 podStartE2EDuration="36.02107607s" podCreationTimestamp="2026-03-10 09:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:26:58.018611844 +0000 UTC m=+1404.273509732" watchObservedRunningTime="2026-03-10 09:26:58.02107607 +0000 UTC m=+1404.275973959" Mar 10 09:27:00 crc kubenswrapper[4883]: I0310 09:27:00.559144 4883 scope.go:117] "RemoveContainer" containerID="7f58830e2474379901c1e5ffc8128426c4b2e0d855c9a6d20d780eb3f2958fa5" Mar 10 09:27:04 crc kubenswrapper[4883]: I0310 09:27:04.050551 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" event={"ID":"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9","Type":"ContainerStarted","Data":"fcf4883556625c19a1455d845635bedb77c5247e864a5aa77d1d91768b67866b"} Mar 10 09:27:04 crc kubenswrapper[4883]: I0310 09:27:04.069872 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" podStartSLOduration=1.905657462 podStartE2EDuration="13.06983589s" podCreationTimestamp="2026-03-10 09:26:51 +0000 UTC" firstStartedPulling="2026-03-10 09:26:52.560778998 +0000 UTC m=+1398.815676887" lastFinishedPulling="2026-03-10 09:27:03.724957426 +0000 UTC m=+1409.979855315" observedRunningTime="2026-03-10 09:27:04.067753535 +0000 UTC m=+1410.322651423" watchObservedRunningTime="2026-03-10 09:27:04.06983589 +0000 UTC m=+1410.324733780" Mar 10 09:27:12 crc 
kubenswrapper[4883]: I0310 09:27:12.995661 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 09:27:13 crc kubenswrapper[4883]: I0310 09:27:13.257678 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 09:27:15 crc kubenswrapper[4883]: I0310 09:27:15.148685 4883 generic.go:334] "Generic (PLEG): container finished" podID="4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" containerID="fcf4883556625c19a1455d845635bedb77c5247e864a5aa77d1d91768b67866b" exitCode=0 Mar 10 09:27:15 crc kubenswrapper[4883]: I0310 09:27:15.148794 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" event={"ID":"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9","Type":"ContainerDied","Data":"fcf4883556625c19a1455d845635bedb77c5247e864a5aa77d1d91768b67866b"} Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.552100 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.689030 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdf25\" (UniqueName: \"kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25\") pod \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.689141 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam\") pod \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.689541 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory\") pod \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.689652 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle\") pod \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\" (UID: \"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9\") " Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.695629 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" (UID: "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.696288 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25" (OuterVolumeSpecName: "kube-api-access-qdf25") pod "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" (UID: "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9"). InnerVolumeSpecName "kube-api-access-qdf25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.713320 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory" (OuterVolumeSpecName: "inventory") pod "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" (UID: "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.719352 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" (UID: "4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.792579 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.792615 4883 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.792630 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdf25\" (UniqueName: \"kubernetes.io/projected/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-kube-api-access-qdf25\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:16 crc kubenswrapper[4883]: I0310 09:27:16.792639 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.168579 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" event={"ID":"4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9","Type":"ContainerDied","Data":"fca1a18a7932a35b0c2408ef7f3f6f3577a7605fbf02ca9e7d5424a6dc0043be"} Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.168959 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fca1a18a7932a35b0c2408ef7f3f6f3577a7605fbf02ca9e7d5424a6dc0043be" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.168622 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.235559 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9"] Mar 10 09:27:17 crc kubenswrapper[4883]: E0310 09:27:17.235920 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.235938 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.236116 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.236663 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.238490 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.238749 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.238946 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.239172 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.249147 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9"] Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.402826 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.402897 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.403029 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c27c\" (UniqueName: \"kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.505091 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c27c\" (UniqueName: \"kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.505175 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.505223 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.511158 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.511914 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.521050 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c27c\" (UniqueName: \"kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pf4n9\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:17 crc kubenswrapper[4883]: I0310 09:27:17.550566 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:18 crc kubenswrapper[4883]: W0310 09:27:18.016748 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3461a81_abbe_4c3e_88ca_42eff1eeb14e.slice/crio-6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3 WatchSource:0}: Error finding container 6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3: Status 404 returned error can't find the container with id 6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3 Mar 10 09:27:18 crc kubenswrapper[4883]: I0310 09:27:18.018175 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9"] Mar 10 09:27:18 crc kubenswrapper[4883]: I0310 09:27:18.187877 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" event={"ID":"d3461a81-abbe-4c3e-88ca-42eff1eeb14e","Type":"ContainerStarted","Data":"6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3"} Mar 10 09:27:19 crc kubenswrapper[4883]: I0310 09:27:19.199408 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" event={"ID":"d3461a81-abbe-4c3e-88ca-42eff1eeb14e","Type":"ContainerStarted","Data":"eb4b3058a2017f66deb4eca98d6a66f7aa07bc8b8282766cfccdcf2f21aeb9bc"} Mar 10 09:27:19 crc kubenswrapper[4883]: I0310 09:27:19.224760 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" podStartSLOduration=1.674391977 podStartE2EDuration="2.224745314s" podCreationTimestamp="2026-03-10 09:27:17 +0000 UTC" firstStartedPulling="2026-03-10 09:27:18.019769242 +0000 UTC m=+1424.274667130" lastFinishedPulling="2026-03-10 09:27:18.570122578 +0000 UTC m=+1424.825020467" observedRunningTime="2026-03-10 
09:27:19.213387817 +0000 UTC m=+1425.468285706" watchObservedRunningTime="2026-03-10 09:27:19.224745314 +0000 UTC m=+1425.479643203" Mar 10 09:27:21 crc kubenswrapper[4883]: I0310 09:27:21.218281 4883 generic.go:334] "Generic (PLEG): container finished" podID="d3461a81-abbe-4c3e-88ca-42eff1eeb14e" containerID="eb4b3058a2017f66deb4eca98d6a66f7aa07bc8b8282766cfccdcf2f21aeb9bc" exitCode=0 Mar 10 09:27:21 crc kubenswrapper[4883]: I0310 09:27:21.218358 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" event={"ID":"d3461a81-abbe-4c3e-88ca-42eff1eeb14e","Type":"ContainerDied","Data":"eb4b3058a2017f66deb4eca98d6a66f7aa07bc8b8282766cfccdcf2f21aeb9bc"} Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.586948 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.707839 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c27c\" (UniqueName: \"kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c\") pod \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.707985 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory\") pod \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\" (UID: \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.708035 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam\") pod \"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\" (UID: 
\"d3461a81-abbe-4c3e-88ca-42eff1eeb14e\") " Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.714692 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c" (OuterVolumeSpecName: "kube-api-access-5c27c") pod "d3461a81-abbe-4c3e-88ca-42eff1eeb14e" (UID: "d3461a81-abbe-4c3e-88ca-42eff1eeb14e"). InnerVolumeSpecName "kube-api-access-5c27c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.734213 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory" (OuterVolumeSpecName: "inventory") pod "d3461a81-abbe-4c3e-88ca-42eff1eeb14e" (UID: "d3461a81-abbe-4c3e-88ca-42eff1eeb14e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.736373 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d3461a81-abbe-4c3e-88ca-42eff1eeb14e" (UID: "d3461a81-abbe-4c3e-88ca-42eff1eeb14e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.810759 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.810800 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:22 crc kubenswrapper[4883]: I0310 09:27:22.810814 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c27c\" (UniqueName: \"kubernetes.io/projected/d3461a81-abbe-4c3e-88ca-42eff1eeb14e-kube-api-access-5c27c\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.240647 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" event={"ID":"d3461a81-abbe-4c3e-88ca-42eff1eeb14e","Type":"ContainerDied","Data":"6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3"} Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.240981 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6494c244889c43b9908a2f490da5c27b9aec8165fca3fb1f3a3a6df858600ed3" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.240740 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pf4n9" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.390866 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"] Mar 10 09:27:23 crc kubenswrapper[4883]: E0310 09:27:23.391295 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3461a81-abbe-4c3e-88ca-42eff1eeb14e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.391314 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3461a81-abbe-4c3e-88ca-42eff1eeb14e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.391502 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3461a81-abbe-4c3e-88ca-42eff1eeb14e" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.392188 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.394096 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.396719 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"] Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.398986 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.399188 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.399336 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.422412 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.422687 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: 
I0310 09:27:23.422869 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.423089 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqgvq\" (UniqueName: \"kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.524013 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqgvq\" (UniqueName: \"kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.524070 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.524140 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.524205 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.529499 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.530213 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.530783 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.538155 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqgvq\" (UniqueName: \"kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:23 crc kubenswrapper[4883]: I0310 09:27:23.716032 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:27:24 crc kubenswrapper[4883]: I0310 09:27:24.238707 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r"] Mar 10 09:27:24 crc kubenswrapper[4883]: I0310 09:27:24.253706 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" event={"ID":"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd","Type":"ContainerStarted","Data":"733b4fd44a5ac5d10292b081b7fba6dec6ce4014fc5f18ef316d40cce4e76602"} Mar 10 09:27:25 crc kubenswrapper[4883]: I0310 09:27:25.267753 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" event={"ID":"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd","Type":"ContainerStarted","Data":"5043f618c5751d9b1d780a8c20af9397bbb12825aec770e998b692d0b3a30888"} Mar 10 09:27:25 crc kubenswrapper[4883]: I0310 09:27:25.288337 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" podStartSLOduration=1.804470642 podStartE2EDuration="2.288320412s" podCreationTimestamp="2026-03-10 09:27:23 +0000 UTC" firstStartedPulling="2026-03-10 09:27:24.241953688 +0000 UTC m=+1430.496851577" 
lastFinishedPulling="2026-03-10 09:27:24.725803458 +0000 UTC m=+1430.980701347" observedRunningTime="2026-03-10 09:27:25.284344394 +0000 UTC m=+1431.539242283" watchObservedRunningTime="2026-03-10 09:27:25.288320412 +0000 UTC m=+1431.543218301" Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.078139 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"] Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.081613 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.098295 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"] Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.261219 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.261381 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.261404 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrlbz\" (UniqueName: \"kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " 
pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.363277 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.363350 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrlbz\" (UniqueName: \"kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.363374 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.363884 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.363930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:47 crc 
kubenswrapper[4883]: I0310 09:27:47.381739 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrlbz\" (UniqueName: \"kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz\") pod \"redhat-operators-zvb5h\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.404747 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:47 crc kubenswrapper[4883]: I0310 09:27:47.812137 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"] Mar 10 09:27:48 crc kubenswrapper[4883]: I0310 09:27:48.468298 4883 generic.go:334] "Generic (PLEG): container finished" podID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerID="bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f" exitCode=0 Mar 10 09:27:48 crc kubenswrapper[4883]: I0310 09:27:48.468414 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerDied","Data":"bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f"} Mar 10 09:27:48 crc kubenswrapper[4883]: I0310 09:27:48.468798 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerStarted","Data":"23cbf9373eb80662f59c04d25e32dcc2645848ba76f31abb0fbd36eb95635d39"} Mar 10 09:27:49 crc kubenswrapper[4883]: I0310 09:27:49.482394 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerStarted","Data":"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1"} Mar 10 09:27:50 crc kubenswrapper[4883]: I0310 
09:27:50.494706 4883 generic.go:334] "Generic (PLEG): container finished" podID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerID="60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1" exitCode=0 Mar 10 09:27:50 crc kubenswrapper[4883]: I0310 09:27:50.494775 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerDied","Data":"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1"} Mar 10 09:27:51 crc kubenswrapper[4883]: I0310 09:27:51.507938 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerStarted","Data":"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c"} Mar 10 09:27:51 crc kubenswrapper[4883]: I0310 09:27:51.526274 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zvb5h" podStartSLOduration=2.057024177 podStartE2EDuration="4.526255182s" podCreationTimestamp="2026-03-10 09:27:47 +0000 UTC" firstStartedPulling="2026-03-10 09:27:48.470896501 +0000 UTC m=+1454.725794390" lastFinishedPulling="2026-03-10 09:27:50.940127506 +0000 UTC m=+1457.195025395" observedRunningTime="2026-03-10 09:27:51.524441772 +0000 UTC m=+1457.779339661" watchObservedRunningTime="2026-03-10 09:27:51.526255182 +0000 UTC m=+1457.781153071" Mar 10 09:27:57 crc kubenswrapper[4883]: I0310 09:27:57.405421 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:57 crc kubenswrapper[4883]: I0310 09:27:57.406044 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:57 crc kubenswrapper[4883]: I0310 09:27:57.449635 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:57 crc kubenswrapper[4883]: I0310 09:27:57.612377 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:27:57 crc kubenswrapper[4883]: I0310 09:27:57.681054 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"] Mar 10 09:27:59 crc kubenswrapper[4883]: I0310 09:27:59.590490 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zvb5h" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="registry-server" containerID="cri-o://b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c" gracePeriod=2 Mar 10 09:27:59 crc kubenswrapper[4883]: I0310 09:27:59.961174 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.138567 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552248-gvmzc"] Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 09:28:00.138960 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="extract-content" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.138977 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="extract-content" Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 09:28:00.138987 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="registry-server" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.138994 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="registry-server" Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 
09:28:00.139015 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="extract-utilities" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.139021 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="extract-utilities" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.139184 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerName="registry-server" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.139781 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-gvmzc" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.141461 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.141701 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.141892 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.150739 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-gvmzc"] Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.152728 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content\") pod \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.152759 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrlbz\" (UniqueName: 
\"kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz\") pod \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.152928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities\") pod \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\" (UID: \"bcd77dbb-628a-4eb6-9910-6d75adb8025c\") " Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.153578 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities" (OuterVolumeSpecName: "utilities") pod "bcd77dbb-628a-4eb6-9910-6d75adb8025c" (UID: "bcd77dbb-628a-4eb6-9910-6d75adb8025c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.158958 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz" (OuterVolumeSpecName: "kube-api-access-zrlbz") pod "bcd77dbb-628a-4eb6-9910-6d75adb8025c" (UID: "bcd77dbb-628a-4eb6-9910-6d75adb8025c"). InnerVolumeSpecName "kube-api-access-zrlbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.256999 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcd77dbb-628a-4eb6-9910-6d75adb8025c" (UID: "bcd77dbb-628a-4eb6-9910-6d75adb8025c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.257049 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7svsl\" (UniqueName: \"kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl\") pod \"auto-csr-approver-29552248-gvmzc\" (UID: \"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce\") " pod="openshift-infra/auto-csr-approver-29552248-gvmzc" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.257354 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.257530 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrlbz\" (UniqueName: \"kubernetes.io/projected/bcd77dbb-628a-4eb6-9910-6d75adb8025c-kube-api-access-zrlbz\") on node \"crc\" DevicePath \"\"" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.257579 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcd77dbb-628a-4eb6-9910-6d75adb8025c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.358633 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7svsl\" (UniqueName: \"kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl\") pod \"auto-csr-approver-29552248-gvmzc\" (UID: \"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce\") " pod="openshift-infra/auto-csr-approver-29552248-gvmzc" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.376321 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7svsl\" (UniqueName: \"kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl\") pod 
\"auto-csr-approver-29552248-gvmzc\" (UID: \"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce\") " pod="openshift-infra/auto-csr-approver-29552248-gvmzc" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.454173 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-gvmzc" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.609589 4883 generic.go:334] "Generic (PLEG): container finished" podID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" containerID="b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c" exitCode=0 Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.609935 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerDied","Data":"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c"} Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.609984 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvb5h" event={"ID":"bcd77dbb-628a-4eb6-9910-6d75adb8025c","Type":"ContainerDied","Data":"23cbf9373eb80662f59c04d25e32dcc2645848ba76f31abb0fbd36eb95635d39"} Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.610018 4883 scope.go:117] "RemoveContainer" containerID="b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.610219 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvb5h" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.657906 4883 scope.go:117] "RemoveContainer" containerID="60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.667252 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"] Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.675015 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zvb5h"] Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.694826 4883 scope.go:117] "RemoveContainer" containerID="bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.740458 4883 scope.go:117] "RemoveContainer" containerID="b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c" Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 09:28:00.740900 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c\": container with ID starting with b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c not found: ID does not exist" containerID="b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.740953 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c"} err="failed to get container status \"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c\": rpc error: code = NotFound desc = could not find container \"b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c\": container with ID starting with b1222c7be3e3bae45d6ad7b3583cdf8595b043ae6c2290446569cf79c5d25a0c not found: ID does 
not exist" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.740989 4883 scope.go:117] "RemoveContainer" containerID="60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1" Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 09:28:00.741334 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1\": container with ID starting with 60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1 not found: ID does not exist" containerID="60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.741366 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1"} err="failed to get container status \"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1\": rpc error: code = NotFound desc = could not find container \"60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1\": container with ID starting with 60c93a4d51bd63ee407d21caec01eccba541939319c4b11a768f1aa49aa57fd1 not found: ID does not exist" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.741389 4883 scope.go:117] "RemoveContainer" containerID="bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f" Mar 10 09:28:00 crc kubenswrapper[4883]: E0310 09:28:00.741751 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f\": container with ID starting with bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f not found: ID does not exist" containerID="bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.741777 4883 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f"} err="failed to get container status \"bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f\": rpc error: code = NotFound desc = could not find container \"bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f\": container with ID starting with bfceb0ad602d25cb562ab27873c813a02f20ab64ff128cea70dbadf490c8d44f not found: ID does not exist" Mar 10 09:28:00 crc kubenswrapper[4883]: I0310 09:28:00.857067 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-gvmzc"] Mar 10 09:28:01 crc kubenswrapper[4883]: I0310 09:28:01.620754 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552248-gvmzc" event={"ID":"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce","Type":"ContainerStarted","Data":"ca84c9526f517c8e62b25d8452228a7e93d95d315eeae37348887444851278b0"} Mar 10 09:28:02 crc kubenswrapper[4883]: I0310 09:28:02.090855 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcd77dbb-628a-4eb6-9910-6d75adb8025c" path="/var/lib/kubelet/pods/bcd77dbb-628a-4eb6-9910-6d75adb8025c/volumes" Mar 10 09:28:02 crc kubenswrapper[4883]: I0310 09:28:02.633868 4883 generic.go:334] "Generic (PLEG): container finished" podID="804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" containerID="bec7039a9730daef00f6f750a9ad04bbfacd1a82b7aa8fa9152df9b60b9ae6d2" exitCode=0 Mar 10 09:28:02 crc kubenswrapper[4883]: I0310 09:28:02.633986 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552248-gvmzc" event={"ID":"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce","Type":"ContainerDied","Data":"bec7039a9730daef00f6f750a9ad04bbfacd1a82b7aa8fa9152df9b60b9ae6d2"} Mar 10 09:28:03 crc kubenswrapper[4883]: I0310 09:28:03.762779 4883 scope.go:117] "RemoveContainer" 
containerID="86f2b3c9600146e785777999cdc5d4ea906b5ad635853fcbc695d4a1b48ea493" Mar 10 09:28:03 crc kubenswrapper[4883]: I0310 09:28:03.915080 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-gvmzc" Mar 10 09:28:03 crc kubenswrapper[4883]: I0310 09:28:03.936500 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7svsl\" (UniqueName: \"kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl\") pod \"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce\" (UID: \"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce\") " Mar 10 09:28:03 crc kubenswrapper[4883]: I0310 09:28:03.942093 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl" (OuterVolumeSpecName: "kube-api-access-7svsl") pod "804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" (UID: "804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce"). InnerVolumeSpecName "kube-api-access-7svsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.040138 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7svsl\" (UniqueName: \"kubernetes.io/projected/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce-kube-api-access-7svsl\") on node \"crc\" DevicePath \"\"" Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.652314 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552248-gvmzc" event={"ID":"804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce","Type":"ContainerDied","Data":"ca84c9526f517c8e62b25d8452228a7e93d95d315eeae37348887444851278b0"} Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.652691 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca84c9526f517c8e62b25d8452228a7e93d95d315eeae37348887444851278b0" Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.652386 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-gvmzc" Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.970246 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-kz9jr"] Mar 10 09:28:04 crc kubenswrapper[4883]: I0310 09:28:04.976533 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-kz9jr"] Mar 10 09:28:06 crc kubenswrapper[4883]: I0310 09:28:06.089732 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12cada45-6ba5-4db1-9a13-3de652b390bb" path="/var/lib/kubelet/pods/12cada45-6ba5-4db1-9a13-3de652b390bb/volumes" Mar 10 09:28:47 crc kubenswrapper[4883]: I0310 09:28:47.449501 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 09:28:47 crc kubenswrapper[4883]: I0310 09:28:47.450118 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:29:03 crc kubenswrapper[4883]: I0310 09:29:03.848983 4883 scope.go:117] "RemoveContainer" containerID="ed2fdfaa94f84ea6d60f3a225de89340f67d80aec7fa6c18b378eb49e482b9b8" Mar 10 09:29:03 crc kubenswrapper[4883]: I0310 09:29:03.889141 4883 scope.go:117] "RemoveContainer" containerID="969226cfd74c44f03958486fd0daa3fd76ae699fc5b9689135438015e7c4fbc5" Mar 10 09:29:03 crc kubenswrapper[4883]: I0310 09:29:03.911117 4883 scope.go:117] "RemoveContainer" containerID="abd0d4cdc5283d36393d37385d0b14c08a59b31c42729a63940bf76c4dcfaa0e" Mar 10 09:29:03 crc kubenswrapper[4883]: I0310 09:29:03.938205 4883 scope.go:117] "RemoveContainer" containerID="8a043f976dc0ec960bd4342407fe8ea99f6aca698feb5f8a45170e566caea1a7" Mar 10 09:29:03 crc kubenswrapper[4883]: I0310 09:29:03.953563 4883 scope.go:117] "RemoveContainer" containerID="11797267a9434882035fa9951a681f2d2f7a8ff3989245476cda7f186fafdc21" Mar 10 09:29:17 crc kubenswrapper[4883]: I0310 09:29:17.449025 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:29:17 crc kubenswrapper[4883]: I0310 09:29:17.449678 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.448537 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.449194 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.449256 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.449803 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.449858 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" gracePeriod=600 Mar 10 09:29:47 crc kubenswrapper[4883]: E0310 09:29:47.572347 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.636343 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" exitCode=0 Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.636396 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b"} Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.636441 4883 scope.go:117] "RemoveContainer" containerID="3173da1ffe8f7435768fc32262b7cd69b641562a52821e6d42f580eafd7ff225" Mar 10 09:29:47 crc kubenswrapper[4883]: I0310 09:29:47.637161 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:29:47 crc kubenswrapper[4883]: E0310 09:29:47.637390 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:29:58 crc kubenswrapper[4883]: I0310 09:29:58.080228 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:29:58 
crc kubenswrapper[4883]: E0310 09:29:58.081082 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.140059 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552250-bzz2p"] Mar 10 09:30:00 crc kubenswrapper[4883]: E0310 09:30:00.141778 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" containerName="oc" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.141856 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" containerName="oc" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.142158 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" containerName="oc" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.143113 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.146089 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.146313 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.146856 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.148039 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552250-bzz2p"] Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.240078 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6"] Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.241694 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.243888 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.244854 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.246784 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6"] Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.313057 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg4sh\" (UniqueName: \"kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh\") pod \"auto-csr-approver-29552250-bzz2p\" (UID: \"aea7fca8-0ec0-44f9-b729-2c150761519f\") " pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.414967 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg4sh\" (UniqueName: \"kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh\") pod \"auto-csr-approver-29552250-bzz2p\" (UID: \"aea7fca8-0ec0-44f9-b729-2c150761519f\") " pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.415080 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 
09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.415149 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ntnk\" (UniqueName: \"kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.415251 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.433152 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg4sh\" (UniqueName: \"kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh\") pod \"auto-csr-approver-29552250-bzz2p\" (UID: \"aea7fca8-0ec0-44f9-b729-2c150761519f\") " pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.462968 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.518371 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ntnk\" (UniqueName: \"kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.519565 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.520009 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.521272 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.524449 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.533409 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ntnk\" (UniqueName: \"kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk\") pod \"collect-profiles-29552250-c4zv6\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.557509 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.860536 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552250-bzz2p"] Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.871510 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:30:00 crc kubenswrapper[4883]: I0310 09:30:00.977244 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6"] Mar 10 09:30:00 crc kubenswrapper[4883]: W0310 09:30:00.980961 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda598a9af_7896_474b_8a2d_8b912f1e867f.slice/crio-b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215 WatchSource:0}: Error finding container b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215: Status 404 returned error can't find the container with id b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215 Mar 10 09:30:01 crc kubenswrapper[4883]: 
I0310 09:30:01.764290 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" event={"ID":"aea7fca8-0ec0-44f9-b729-2c150761519f","Type":"ContainerStarted","Data":"c7563826ba891aedae5e9849bbbf3e1d93261f7df06477e9ab1e881e2e951231"} Mar 10 09:30:01 crc kubenswrapper[4883]: I0310 09:30:01.766460 4883 generic.go:334] "Generic (PLEG): container finished" podID="a598a9af-7896-474b-8a2d-8b912f1e867f" containerID="007d3e7664b78fdbe1537c1501b8bf98b8877dc94b31a1d901b914a86cb7fa02" exitCode=0 Mar 10 09:30:01 crc kubenswrapper[4883]: I0310 09:30:01.766518 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" event={"ID":"a598a9af-7896-474b-8a2d-8b912f1e867f","Type":"ContainerDied","Data":"007d3e7664b78fdbe1537c1501b8bf98b8877dc94b31a1d901b914a86cb7fa02"} Mar 10 09:30:01 crc kubenswrapper[4883]: I0310 09:30:01.766541 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" event={"ID":"a598a9af-7896-474b-8a2d-8b912f1e867f","Type":"ContainerStarted","Data":"b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215"} Mar 10 09:30:02 crc kubenswrapper[4883]: I0310 09:30:02.778378 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" event={"ID":"aea7fca8-0ec0-44f9-b729-2c150761519f","Type":"ContainerStarted","Data":"55b96e78b0c842f2c57bd194eb8366216f87ea2f29ff4952bc2e23513be83089"} Mar 10 09:30:02 crc kubenswrapper[4883]: I0310 09:30:02.797996 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" podStartSLOduration=1.241920905 podStartE2EDuration="2.797975532s" podCreationTimestamp="2026-03-10 09:30:00 +0000 UTC" firstStartedPulling="2026-03-10 09:30:00.871206703 +0000 UTC m=+1587.126104592" lastFinishedPulling="2026-03-10 09:30:02.42726133 +0000 
UTC m=+1588.682159219" observedRunningTime="2026-03-10 09:30:02.794449323 +0000 UTC m=+1589.049347212" watchObservedRunningTime="2026-03-10 09:30:02.797975532 +0000 UTC m=+1589.052873421" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.076183 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.272307 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume\") pod \"a598a9af-7896-474b-8a2d-8b912f1e867f\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.272387 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ntnk\" (UniqueName: \"kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk\") pod \"a598a9af-7896-474b-8a2d-8b912f1e867f\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.272568 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume\") pod \"a598a9af-7896-474b-8a2d-8b912f1e867f\" (UID: \"a598a9af-7896-474b-8a2d-8b912f1e867f\") " Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.273775 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume" (OuterVolumeSpecName: "config-volume") pod "a598a9af-7896-474b-8a2d-8b912f1e867f" (UID: "a598a9af-7896-474b-8a2d-8b912f1e867f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.277671 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a598a9af-7896-474b-8a2d-8b912f1e867f" (UID: "a598a9af-7896-474b-8a2d-8b912f1e867f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.278397 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk" (OuterVolumeSpecName: "kube-api-access-4ntnk") pod "a598a9af-7896-474b-8a2d-8b912f1e867f" (UID: "a598a9af-7896-474b-8a2d-8b912f1e867f"). InnerVolumeSpecName "kube-api-access-4ntnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.374558 4883 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a598a9af-7896-474b-8a2d-8b912f1e867f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.374609 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ntnk\" (UniqueName: \"kubernetes.io/projected/a598a9af-7896-474b-8a2d-8b912f1e867f-kube-api-access-4ntnk\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.374619 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a598a9af-7896-474b-8a2d-8b912f1e867f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.791961 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" 
event={"ID":"a598a9af-7896-474b-8a2d-8b912f1e867f","Type":"ContainerDied","Data":"b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215"} Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.792328 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b38fc3101cd9e0c4e2d833a3c4a54b95259742f41b6955bec8a9cff5f2a9f215" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.792196 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-c4zv6" Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.794423 4883 generic.go:334] "Generic (PLEG): container finished" podID="aea7fca8-0ec0-44f9-b729-2c150761519f" containerID="55b96e78b0c842f2c57bd194eb8366216f87ea2f29ff4952bc2e23513be83089" exitCode=0 Mar 10 09:30:03 crc kubenswrapper[4883]: I0310 09:30:03.794468 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" event={"ID":"aea7fca8-0ec0-44f9-b729-2c150761519f","Type":"ContainerDied","Data":"55b96e78b0c842f2c57bd194eb8366216f87ea2f29ff4952bc2e23513be83089"} Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.088117 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.207331 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg4sh\" (UniqueName: \"kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh\") pod \"aea7fca8-0ec0-44f9-b729-2c150761519f\" (UID: \"aea7fca8-0ec0-44f9-b729-2c150761519f\") " Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.211767 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh" (OuterVolumeSpecName: "kube-api-access-dg4sh") pod "aea7fca8-0ec0-44f9-b729-2c150761519f" (UID: "aea7fca8-0ec0-44f9-b729-2c150761519f"). InnerVolumeSpecName "kube-api-access-dg4sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.310227 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg4sh\" (UniqueName: \"kubernetes.io/projected/aea7fca8-0ec0-44f9-b729-2c150761519f-kube-api-access-dg4sh\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.816141 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" event={"ID":"aea7fca8-0ec0-44f9-b729-2c150761519f","Type":"ContainerDied","Data":"c7563826ba891aedae5e9849bbbf3e1d93261f7df06477e9ab1e881e2e951231"} Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.816191 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-bzz2p" Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.816205 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7563826ba891aedae5e9849bbbf3e1d93261f7df06477e9ab1e881e2e951231" Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.856938 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-zwxrg"] Mar 10 09:30:05 crc kubenswrapper[4883]: I0310 09:30:05.863740 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-zwxrg"] Mar 10 09:30:06 crc kubenswrapper[4883]: I0310 09:30:06.090290 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="391543cc-519b-4e01-8886-04bde62c5298" path="/var/lib/kubelet/pods/391543cc-519b-4e01-8886-04bde62c5298/volumes" Mar 10 09:30:13 crc kubenswrapper[4883]: I0310 09:30:13.080852 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:30:13 crc kubenswrapper[4883]: E0310 09:30:13.081520 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:30:24 crc kubenswrapper[4883]: I0310 09:30:24.085951 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:30:24 crc kubenswrapper[4883]: E0310 09:30:24.087105 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:30:28 crc kubenswrapper[4883]: I0310 09:30:28.027730 4883 generic.go:334] "Generic (PLEG): container finished" podID="de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" containerID="5043f618c5751d9b1d780a8c20af9397bbb12825aec770e998b692d0b3a30888" exitCode=0 Mar 10 09:30:28 crc kubenswrapper[4883]: I0310 09:30:28.027835 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" event={"ID":"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd","Type":"ContainerDied","Data":"5043f618c5751d9b1d780a8c20af9397bbb12825aec770e998b692d0b3a30888"} Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.382148 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.482804 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory\") pod \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.483042 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqgvq\" (UniqueName: \"kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq\") pod \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.483076 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam\") pod \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.483333 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle\") pod \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\" (UID: \"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd\") " Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.488866 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" (UID: "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.489871 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq" (OuterVolumeSpecName: "kube-api-access-vqgvq") pod "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" (UID: "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd"). InnerVolumeSpecName "kube-api-access-vqgvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.506305 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" (UID: "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.509155 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory" (OuterVolumeSpecName: "inventory") pod "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" (UID: "de8c98db-31db-4ecd-83f2-c53d4bdd2ddd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.585182 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqgvq\" (UniqueName: \"kubernetes.io/projected/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-kube-api-access-vqgvq\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.585213 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.585224 4883 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:29 crc kubenswrapper[4883]: I0310 09:30:29.585237 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de8c98db-31db-4ecd-83f2-c53d4bdd2ddd-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.052559 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" event={"ID":"de8c98db-31db-4ecd-83f2-c53d4bdd2ddd","Type":"ContainerDied","Data":"733b4fd44a5ac5d10292b081b7fba6dec6ce4014fc5f18ef316d40cce4e76602"} Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.052627 4883 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="733b4fd44a5ac5d10292b081b7fba6dec6ce4014fc5f18ef316d40cce4e76602" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.052726 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.124612 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"] Mar 10 09:30:30 crc kubenswrapper[4883]: E0310 09:30:30.125112 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a598a9af-7896-474b-8a2d-8b912f1e867f" containerName="collect-profiles" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125134 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a598a9af-7896-474b-8a2d-8b912f1e867f" containerName="collect-profiles" Mar 10 09:30:30 crc kubenswrapper[4883]: E0310 09:30:30.125150 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea7fca8-0ec0-44f9-b729-2c150761519f" containerName="oc" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125156 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea7fca8-0ec0-44f9-b729-2c150761519f" containerName="oc" Mar 10 09:30:30 crc kubenswrapper[4883]: E0310 09:30:30.125176 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125184 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125370 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8c98db-31db-4ecd-83f2-c53d4bdd2ddd" 
containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125388 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a598a9af-7896-474b-8a2d-8b912f1e867f" containerName="collect-profiles" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.125400 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea7fca8-0ec0-44f9-b729-2c150761519f" containerName="oc" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.126071 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.129082 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.129335 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.129948 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.130113 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.147746 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"] Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.194592 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqrjd\" (UniqueName: \"kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.194745 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.194808 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.298375 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqrjd\" (UniqueName: \"kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.298884 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.299290 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.303963 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.304234 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.318075 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqrjd\" (UniqueName: \"kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.440086 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:30:30 crc kubenswrapper[4883]: I0310 09:30:30.909925 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr"] Mar 10 09:30:31 crc kubenswrapper[4883]: I0310 09:30:31.063250 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" event={"ID":"2428d4e5-b48e-45ad-9bfb-711c3b1e8471","Type":"ContainerStarted","Data":"d875b5a0a6f7adae1cffd30aa5fa08f42f89b6a3458e653c961dbd1e2be218bd"} Mar 10 09:30:32 crc kubenswrapper[4883]: I0310 09:30:32.076855 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" event={"ID":"2428d4e5-b48e-45ad-9bfb-711c3b1e8471","Type":"ContainerStarted","Data":"6e2ca16529cbec88524c07de2c8616688102bc3697fc5f0c4b0e4d88eda0ea79"} Mar 10 09:30:32 crc kubenswrapper[4883]: I0310 09:30:32.099964 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" podStartSLOduration=1.437661975 podStartE2EDuration="2.099942071s" podCreationTimestamp="2026-03-10 09:30:30 +0000 UTC" firstStartedPulling="2026-03-10 09:30:30.913239466 +0000 UTC m=+1617.168137355" lastFinishedPulling="2026-03-10 09:30:31.575519561 +0000 UTC m=+1617.830417451" observedRunningTime="2026-03-10 09:30:32.093008987 +0000 UTC m=+1618.347906876" watchObservedRunningTime="2026-03-10 09:30:32.099942071 +0000 UTC m=+1618.354839959" Mar 10 09:30:37 crc kubenswrapper[4883]: I0310 09:30:37.080036 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:30:37 crc kubenswrapper[4883]: E0310 09:30:37.081013 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.146207 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"] Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.148760 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.158828 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"] Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.191297 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.191392 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.191469 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5wh8\" (UniqueName: \"kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8\") pod \"certified-operators-6v8n9\" (UID: 
\"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.292873 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.293398 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.293522 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5wh8\" (UniqueName: \"kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.293712 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.294183 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") 
" pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.311798 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5wh8\" (UniqueName: \"kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8\") pod \"certified-operators-6v8n9\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.465433 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:45 crc kubenswrapper[4883]: I0310 09:30:45.939386 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"] Mar 10 09:30:46 crc kubenswrapper[4883]: I0310 09:30:46.217749 4883 generic.go:334] "Generic (PLEG): container finished" podID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerID="03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d" exitCode=0 Mar 10 09:30:46 crc kubenswrapper[4883]: I0310 09:30:46.217838 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerDied","Data":"03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d"} Mar 10 09:30:46 crc kubenswrapper[4883]: I0310 09:30:46.217991 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerStarted","Data":"6219b9d72ad89a4d1269b3954abbe663509aa516271b361aed1997777f6d30fc"} Mar 10 09:30:48 crc kubenswrapper[4883]: I0310 09:30:48.079910 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:30:48 crc kubenswrapper[4883]: E0310 09:30:48.080880 4883 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:30:48 crc kubenswrapper[4883]: I0310 09:30:48.241348 4883 generic.go:334] "Generic (PLEG): container finished" podID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerID="a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7" exitCode=0 Mar 10 09:30:48 crc kubenswrapper[4883]: I0310 09:30:48.241415 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerDied","Data":"a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7"} Mar 10 09:30:49 crc kubenswrapper[4883]: I0310 09:30:49.251864 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerStarted","Data":"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22"} Mar 10 09:30:49 crc kubenswrapper[4883]: I0310 09:30:49.272843 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6v8n9" podStartSLOduration=1.740131126 podStartE2EDuration="4.272826526s" podCreationTimestamp="2026-03-10 09:30:45 +0000 UTC" firstStartedPulling="2026-03-10 09:30:46.219393481 +0000 UTC m=+1632.474291370" lastFinishedPulling="2026-03-10 09:30:48.752088881 +0000 UTC m=+1635.006986770" observedRunningTime="2026-03-10 09:30:49.266506739 +0000 UTC m=+1635.521404628" watchObservedRunningTime="2026-03-10 09:30:49.272826526 +0000 UTC m=+1635.527724405" Mar 10 09:30:55 crc kubenswrapper[4883]: I0310 09:30:55.466101 
4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:55 crc kubenswrapper[4883]: I0310 09:30:55.466678 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:55 crc kubenswrapper[4883]: I0310 09:30:55.511309 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:56 crc kubenswrapper[4883]: I0310 09:30:56.367606 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:56 crc kubenswrapper[4883]: I0310 09:30:56.410208 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"] Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.345548 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6v8n9" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="registry-server" containerID="cri-o://0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22" gracePeriod=2 Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.779566 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.901894 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5wh8\" (UniqueName: \"kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8\") pod \"7873bd43-b295-4b71-bc81-1d7c3a894778\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.902053 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content\") pod \"7873bd43-b295-4b71-bc81-1d7c3a894778\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.902181 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities\") pod \"7873bd43-b295-4b71-bc81-1d7c3a894778\" (UID: \"7873bd43-b295-4b71-bc81-1d7c3a894778\") " Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.903312 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities" (OuterVolumeSpecName: "utilities") pod "7873bd43-b295-4b71-bc81-1d7c3a894778" (UID: "7873bd43-b295-4b71-bc81-1d7c3a894778"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.908621 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8" (OuterVolumeSpecName: "kube-api-access-d5wh8") pod "7873bd43-b295-4b71-bc81-1d7c3a894778" (UID: "7873bd43-b295-4b71-bc81-1d7c3a894778"). InnerVolumeSpecName "kube-api-access-d5wh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:30:58 crc kubenswrapper[4883]: I0310 09:30:58.953561 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7873bd43-b295-4b71-bc81-1d7c3a894778" (UID: "7873bd43-b295-4b71-bc81-1d7c3a894778"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.006310 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5wh8\" (UniqueName: \"kubernetes.io/projected/7873bd43-b295-4b71-bc81-1d7c3a894778-kube-api-access-d5wh8\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.006361 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.006375 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7873bd43-b295-4b71-bc81-1d7c3a894778-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.081596 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:30:59 crc kubenswrapper[4883]: E0310 09:30:59.082188 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:30:59 
crc kubenswrapper[4883]: I0310 09:30:59.359309 4883 generic.go:334] "Generic (PLEG): container finished" podID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerID="0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22" exitCode=0 Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.359377 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerDied","Data":"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22"} Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.359411 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6v8n9" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.359435 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v8n9" event={"ID":"7873bd43-b295-4b71-bc81-1d7c3a894778","Type":"ContainerDied","Data":"6219b9d72ad89a4d1269b3954abbe663509aa516271b361aed1997777f6d30fc"} Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.359459 4883 scope.go:117] "RemoveContainer" containerID="0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.382327 4883 scope.go:117] "RemoveContainer" containerID="a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.393551 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"] Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.401340 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6v8n9"] Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.422006 4883 scope.go:117] "RemoveContainer" containerID="03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d" Mar 10 09:30:59 crc 
kubenswrapper[4883]: I0310 09:30:59.440950 4883 scope.go:117] "RemoveContainer" containerID="0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22" Mar 10 09:30:59 crc kubenswrapper[4883]: E0310 09:30:59.441509 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22\": container with ID starting with 0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22 not found: ID does not exist" containerID="0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.441550 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22"} err="failed to get container status \"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22\": rpc error: code = NotFound desc = could not find container \"0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22\": container with ID starting with 0b0894289ff5844dfc3dacffbea8562a8eab7b11cfe8cc912a4ae4a73867fa22 not found: ID does not exist" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.441577 4883 scope.go:117] "RemoveContainer" containerID="a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7" Mar 10 09:30:59 crc kubenswrapper[4883]: E0310 09:30:59.441916 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7\": container with ID starting with a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7 not found: ID does not exist" containerID="a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.441948 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7"} err="failed to get container status \"a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7\": rpc error: code = NotFound desc = could not find container \"a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7\": container with ID starting with a35961da7ad61a131f78a498b9999879de9613eaf2eb545c63415d118821e7a7 not found: ID does not exist" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.441970 4883 scope.go:117] "RemoveContainer" containerID="03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d" Mar 10 09:30:59 crc kubenswrapper[4883]: E0310 09:30:59.442259 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d\": container with ID starting with 03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d not found: ID does not exist" containerID="03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d" Mar 10 09:30:59 crc kubenswrapper[4883]: I0310 09:30:59.442322 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d"} err="failed to get container status \"03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d\": rpc error: code = NotFound desc = could not find container \"03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d\": container with ID starting with 03157fe2f5a329e0611ec853785722babd1542ed103b4e2a9177669d1e6d335d not found: ID does not exist" Mar 10 09:31:00 crc kubenswrapper[4883]: I0310 09:31:00.094827 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" path="/var/lib/kubelet/pods/7873bd43-b295-4b71-bc81-1d7c3a894778/volumes" Mar 10 09:31:04 crc kubenswrapper[4883]: I0310 
09:31:04.077661 4883 scope.go:117] "RemoveContainer" containerID="6ae90501c695dbc91d1432637dfb47eec6c627c8a17eb905f14422cd13154c3c" Mar 10 09:31:13 crc kubenswrapper[4883]: I0310 09:31:13.080283 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:31:13 crc kubenswrapper[4883]: E0310 09:31:13.081233 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:31:26 crc kubenswrapper[4883]: I0310 09:31:26.080698 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:31:26 crc kubenswrapper[4883]: E0310 09:31:26.081392 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:31:39 crc kubenswrapper[4883]: I0310 09:31:39.080793 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:31:39 crc kubenswrapper[4883]: E0310 09:31:39.081687 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.038111 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mj8nd"] Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.044266 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pv8r6"] Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.051778 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8903-account-create-update-lxrp4"] Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.056871 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mj8nd"] Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.061936 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8903-account-create-update-lxrp4"] Mar 10 09:31:47 crc kubenswrapper[4883]: I0310 09:31:47.067034 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pv8r6"] Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.029807 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c7c6-account-create-update-bzdlt"] Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.036059 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d500-account-create-update-fpfdr"] Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.041483 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-j9kwf"] Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.047445 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c7c6-account-create-update-bzdlt"] Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.053254 4883 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/keystone-d500-account-create-update-fpfdr"] Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.059350 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-j9kwf"] Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.087877 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258d7844-9a92-460a-a768-a5dca2fb5db9" path="/var/lib/kubelet/pods/258d7844-9a92-460a-a768-a5dca2fb5db9/volumes" Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.088539 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486b3226-21be-4783-8b29-abaf747a7693" path="/var/lib/kubelet/pods/486b3226-21be-4783-8b29-abaf747a7693/volumes" Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.089113 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58599ed2-6176-4003-8bdc-2a1d805da51f" path="/var/lib/kubelet/pods/58599ed2-6176-4003-8bdc-2a1d805da51f/volumes" Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.089745 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6195b8a8-c8aa-4d92-b58b-066a2df99bd3" path="/var/lib/kubelet/pods/6195b8a8-c8aa-4d92-b58b-066a2df99bd3/volumes" Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.090773 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="698612ed-a736-4d3d-9a0e-4c75fdd1400f" path="/var/lib/kubelet/pods/698612ed-a736-4d3d-9a0e-4c75fdd1400f/volumes" Mar 10 09:31:48 crc kubenswrapper[4883]: I0310 09:31:48.091317 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84b33a38-a6de-4d7e-b24d-ecf5f95f6c34" path="/var/lib/kubelet/pods/84b33a38-a6de-4d7e-b24d-ecf5f95f6c34/volumes" Mar 10 09:31:54 crc kubenswrapper[4883]: I0310 09:31:54.085687 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:31:54 crc kubenswrapper[4883]: E0310 09:31:54.086765 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.137743 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552252-vcsb9"] Mar 10 09:32:00 crc kubenswrapper[4883]: E0310 09:32:00.138942 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="registry-server" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.138957 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="registry-server" Mar 10 09:32:00 crc kubenswrapper[4883]: E0310 09:32:00.138974 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="extract-content" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.138982 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="extract-content" Mar 10 09:32:00 crc kubenswrapper[4883]: E0310 09:32:00.139004 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="extract-utilities" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.139010 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="extract-utilities" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.139242 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7873bd43-b295-4b71-bc81-1d7c3a894778" containerName="registry-server" Mar 
10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.139982 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.142234 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.142498 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.142625 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.152331 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552252-vcsb9"] Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.177816 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mltvg\" (UniqueName: \"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg\") pod \"auto-csr-approver-29552252-vcsb9\" (UID: \"fc74aa89-09d6-4974-a6c1-1642f6ef0a64\") " pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.280401 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mltvg\" (UniqueName: \"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg\") pod \"auto-csr-approver-29552252-vcsb9\" (UID: \"fc74aa89-09d6-4974-a6c1-1642f6ef0a64\") " pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.300194 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mltvg\" (UniqueName: 
\"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg\") pod \"auto-csr-approver-29552252-vcsb9\" (UID: \"fc74aa89-09d6-4974-a6c1-1642f6ef0a64\") " pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.459307 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.846908 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552252-vcsb9"] Mar 10 09:32:00 crc kubenswrapper[4883]: I0310 09:32:00.883130 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" event={"ID":"fc74aa89-09d6-4974-a6c1-1642f6ef0a64","Type":"ContainerStarted","Data":"bd31784591c30d4ac2e343f3fe05ea06508177f3dad3922412f06d03f3987004"} Mar 10 09:32:01 crc kubenswrapper[4883]: I0310 09:32:01.894710 4883 generic.go:334] "Generic (PLEG): container finished" podID="2428d4e5-b48e-45ad-9bfb-711c3b1e8471" containerID="6e2ca16529cbec88524c07de2c8616688102bc3697fc5f0c4b0e4d88eda0ea79" exitCode=0 Mar 10 09:32:01 crc kubenswrapper[4883]: I0310 09:32:01.894787 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" event={"ID":"2428d4e5-b48e-45ad-9bfb-711c3b1e8471","Type":"ContainerDied","Data":"6e2ca16529cbec88524c07de2c8616688102bc3697fc5f0c4b0e4d88eda0ea79"} Mar 10 09:32:02 crc kubenswrapper[4883]: I0310 09:32:02.905962 4883 generic.go:334] "Generic (PLEG): container finished" podID="fc74aa89-09d6-4974-a6c1-1642f6ef0a64" containerID="76a36df1ff76227c193949f769a79a8229f0c35af6ce9046d5c6bb133c432611" exitCode=0 Mar 10 09:32:02 crc kubenswrapper[4883]: I0310 09:32:02.906082 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" 
event={"ID":"fc74aa89-09d6-4974-a6c1-1642f6ef0a64","Type":"ContainerDied","Data":"76a36df1ff76227c193949f769a79a8229f0c35af6ce9046d5c6bb133c432611"} Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.235298 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.343716 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam\") pod \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.343777 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory\") pod \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.343806 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqrjd\" (UniqueName: \"kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd\") pod \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\" (UID: \"2428d4e5-b48e-45ad-9bfb-711c3b1e8471\") " Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.349204 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd" (OuterVolumeSpecName: "kube-api-access-cqrjd") pod "2428d4e5-b48e-45ad-9bfb-711c3b1e8471" (UID: "2428d4e5-b48e-45ad-9bfb-711c3b1e8471"). InnerVolumeSpecName "kube-api-access-cqrjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.367647 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2428d4e5-b48e-45ad-9bfb-711c3b1e8471" (UID: "2428d4e5-b48e-45ad-9bfb-711c3b1e8471"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.369773 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory" (OuterVolumeSpecName: "inventory") pod "2428d4e5-b48e-45ad-9bfb-711c3b1e8471" (UID: "2428d4e5-b48e-45ad-9bfb-711c3b1e8471"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.447003 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.447039 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.447054 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqrjd\" (UniqueName: \"kubernetes.io/projected/2428d4e5-b48e-45ad-9bfb-711c3b1e8471-kube-api-access-cqrjd\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.917998 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" 
event={"ID":"2428d4e5-b48e-45ad-9bfb-711c3b1e8471","Type":"ContainerDied","Data":"d875b5a0a6f7adae1cffd30aa5fa08f42f89b6a3458e653c961dbd1e2be218bd"} Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.918033 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.918061 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d875b5a0a6f7adae1cffd30aa5fa08f42f89b6a3458e653c961dbd1e2be218bd" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.987924 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm"] Mar 10 09:32:03 crc kubenswrapper[4883]: E0310 09:32:03.988538 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2428d4e5-b48e-45ad-9bfb-711c3b1e8471" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.988561 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="2428d4e5-b48e-45ad-9bfb-711c3b1e8471" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.988830 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2428d4e5-b48e-45ad-9bfb-711c3b1e8471" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.989745 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.994548 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.994586 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.994639 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:32:03 crc kubenswrapper[4883]: I0310 09:32:03.994540 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.007168 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm"] Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.146789 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.163517 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.163609 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.163760 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dmqb\" (UniqueName: \"kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.164847 4883 scope.go:117] "RemoveContainer" containerID="067357e3a311966c2851280ac43f9cd60519f964886eed01635d2c8acbc8045b" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.209366 4883 scope.go:117] "RemoveContainer" containerID="86cc309342e04f12de9f243fac1e7adc270651f62f05738383b3854942ebc072" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.238685 4883 scope.go:117] "RemoveContainer" 
containerID="9b3a01ef455743297929fe3e8d915e6b5c1a6d87ee8313151edd54b3c5c1c1d3" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.265259 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mltvg\" (UniqueName: \"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg\") pod \"fc74aa89-09d6-4974-a6c1-1642f6ef0a64\" (UID: \"fc74aa89-09d6-4974-a6c1-1642f6ef0a64\") " Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.265741 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.265793 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.265880 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dmqb\" (UniqueName: \"kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.270655 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg" (OuterVolumeSpecName: "kube-api-access-mltvg") pod "fc74aa89-09d6-4974-a6c1-1642f6ef0a64" (UID: "fc74aa89-09d6-4974-a6c1-1642f6ef0a64"). InnerVolumeSpecName "kube-api-access-mltvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.271355 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.271351 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.272043 4883 scope.go:117] "RemoveContainer" containerID="e7877e4d896a5e48fb94d0bb9e636d179a97dbbe531d524d3bf059533ec08d74" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.281774 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dmqb\" (UniqueName: \"kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.306041 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.338104 4883 scope.go:117] "RemoveContainer" containerID="6677f5c2edc8cf5df63041699d2713762ffd5b4bdf18bb3f374e397d55004166" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.358711 4883 scope.go:117] "RemoveContainer" containerID="b981b386d21855c9b21b1262acdcccebfb4995ef8da840373e95a5a29e03699c" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.368504 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mltvg\" (UniqueName: \"kubernetes.io/projected/fc74aa89-09d6-4974-a6c1-1642f6ef0a64-kube-api-access-mltvg\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.781888 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm"] Mar 10 09:32:04 crc kubenswrapper[4883]: W0310 09:32:04.786004 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07ddb6af_f2c7_46eb_aac4_fe69996caf27.slice/crio-1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829 WatchSource:0}: Error finding container 1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829: Status 404 returned error can't find the container with id 1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829 Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.927851 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" event={"ID":"fc74aa89-09d6-4974-a6c1-1642f6ef0a64","Type":"ContainerDied","Data":"bd31784591c30d4ac2e343f3fe05ea06508177f3dad3922412f06d03f3987004"} Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.927899 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-vcsb9" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.927922 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd31784591c30d4ac2e343f3fe05ea06508177f3dad3922412f06d03f3987004" Mar 10 09:32:04 crc kubenswrapper[4883]: I0310 09:32:04.930502 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" event={"ID":"07ddb6af-f2c7-46eb-aac4-fe69996caf27","Type":"ContainerStarted","Data":"1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829"} Mar 10 09:32:05 crc kubenswrapper[4883]: I0310 09:32:05.216712 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-r4bxm"] Mar 10 09:32:05 crc kubenswrapper[4883]: I0310 09:32:05.224993 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-r4bxm"] Mar 10 09:32:05 crc kubenswrapper[4883]: I0310 09:32:05.942351 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" event={"ID":"07ddb6af-f2c7-46eb-aac4-fe69996caf27","Type":"ContainerStarted","Data":"2bfb6eb0197869370269e475b7e819de3708dd20b2a400d6accc4988b6fa951b"} Mar 10 09:32:06 crc kubenswrapper[4883]: I0310 09:32:06.089677 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcfbbeba-ae1f-4e53-ba68-3cc981395803" path="/var/lib/kubelet/pods/bcfbbeba-ae1f-4e53-ba68-3cc981395803/volumes" Mar 10 09:32:07 crc kubenswrapper[4883]: I0310 09:32:07.080711 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:32:07 crc kubenswrapper[4883]: E0310 09:32:07.081268 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:32:10 crc kubenswrapper[4883]: I0310 09:32:10.023123 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" podStartSLOduration=6.447872441 podStartE2EDuration="7.023096805s" podCreationTimestamp="2026-03-10 09:32:03 +0000 UTC" firstStartedPulling="2026-03-10 09:32:04.788285587 +0000 UTC m=+1711.043183477" lastFinishedPulling="2026-03-10 09:32:05.363509941 +0000 UTC m=+1711.618407841" observedRunningTime="2026-03-10 09:32:05.979844162 +0000 UTC m=+1712.234742051" watchObservedRunningTime="2026-03-10 09:32:10.023096805 +0000 UTC m=+1716.277994695" Mar 10 09:32:10 crc kubenswrapper[4883]: I0310 09:32:10.032820 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cdc4t"] Mar 10 09:32:10 crc kubenswrapper[4883]: I0310 09:32:10.039980 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cdc4t"] Mar 10 09:32:10 crc kubenswrapper[4883]: I0310 09:32:10.092501 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d523ed0-183e-4bec-a110-fe622b69ef79" path="/var/lib/kubelet/pods/5d523ed0-183e-4bec-a110-fe622b69ef79/volumes" Mar 10 09:32:13 crc kubenswrapper[4883]: I0310 09:32:13.029561 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xtrqg"] Mar 10 09:32:13 crc kubenswrapper[4883]: I0310 09:32:13.035076 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xtrqg"] Mar 10 09:32:14 crc kubenswrapper[4883]: I0310 09:32:14.092842 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5485539-c722-477d-b595-649e07eac50e" 
path="/var/lib/kubelet/pods/d5485539-c722-477d-b595-649e07eac50e/volumes" Mar 10 09:32:19 crc kubenswrapper[4883]: I0310 09:32:19.079871 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:32:19 crc kubenswrapper[4883]: E0310 09:32:19.080935 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:32:25 crc kubenswrapper[4883]: I0310 09:32:25.046227 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-27c8-account-create-update-9w2q5"] Mar 10 09:32:25 crc kubenswrapper[4883]: I0310 09:32:25.052738 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ef9a-account-create-update-4bwrd"] Mar 10 09:32:25 crc kubenswrapper[4883]: I0310 09:32:25.057937 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-27c8-account-create-update-9w2q5"] Mar 10 09:32:25 crc kubenswrapper[4883]: I0310 09:32:25.063032 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ef9a-account-create-update-4bwrd"] Mar 10 09:32:26 crc kubenswrapper[4883]: I0310 09:32:26.089939 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c357ec28-9cec-42e8-9e4d-dc1fb9960bc7" path="/var/lib/kubelet/pods/c357ec28-9cec-42e8-9e4d-dc1fb9960bc7/volumes" Mar 10 09:32:26 crc kubenswrapper[4883]: I0310 09:32:26.090604 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed176738-d518-45e3-be47-3ace090d0e7a" path="/var/lib/kubelet/pods/ed176738-d518-45e3-be47-3ace090d0e7a/volumes" Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 
09:32:28.027716 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8664j"] Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.034878 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wzxkv"] Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.040535 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hrq22"] Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.045437 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8664j"] Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.050433 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-fca8-account-create-update-7jkwx"] Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.055730 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wzxkv"] Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.060839 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hrq22"] Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.065998 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-fca8-account-create-update-7jkwx"] Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.091813 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a8b78f-e864-49d5-9dfb-aebd86741885" path="/var/lib/kubelet/pods/07a8b78f-e864-49d5-9dfb-aebd86741885/volumes" Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.092387 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24713bd6-5868-43ec-94ec-2371a49a0b88" path="/var/lib/kubelet/pods/24713bd6-5868-43ec-94ec-2371a49a0b88/volumes" Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.092959 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94df275b-e089-4e1f-8eac-e4806d2f1178" 
path="/var/lib/kubelet/pods/94df275b-e089-4e1f-8eac-e4806d2f1178/volumes" Mar 10 09:32:28 crc kubenswrapper[4883]: I0310 09:32:28.093542 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7aa202-c734-4333-a1de-1bdb39d59804" path="/var/lib/kubelet/pods/ed7aa202-c734-4333-a1de-1bdb39d59804/volumes" Mar 10 09:32:31 crc kubenswrapper[4883]: I0310 09:32:31.026527 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-w5q6s"] Mar 10 09:32:31 crc kubenswrapper[4883]: I0310 09:32:31.032725 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-w5q6s"] Mar 10 09:32:32 crc kubenswrapper[4883]: I0310 09:32:32.093794 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96942836-243a-48c5-be3d-5eb5e5f166d0" path="/var/lib/kubelet/pods/96942836-243a-48c5-be3d-5eb5e5f166d0/volumes" Mar 10 09:32:33 crc kubenswrapper[4883]: I0310 09:32:33.079811 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:32:33 crc kubenswrapper[4883]: E0310 09:32:33.080346 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:32:45 crc kubenswrapper[4883]: I0310 09:32:45.079715 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:32:45 crc kubenswrapper[4883]: E0310 09:32:45.080321 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:32:54 crc kubenswrapper[4883]: I0310 09:32:54.034834 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kcmgr"] Mar 10 09:32:54 crc kubenswrapper[4883]: I0310 09:32:54.040628 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kcmgr"] Mar 10 09:32:54 crc kubenswrapper[4883]: I0310 09:32:54.138705 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5a25758-cb77-448b-a856-3dbc6df2bc21" path="/var/lib/kubelet/pods/f5a25758-cb77-448b-a856-3dbc6df2bc21/volumes" Mar 10 09:32:57 crc kubenswrapper[4883]: I0310 09:32:57.080062 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:32:57 crc kubenswrapper[4883]: E0310 09:32:57.080434 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:32:58 crc kubenswrapper[4883]: I0310 09:32:58.419095 4883 generic.go:334] "Generic (PLEG): container finished" podID="07ddb6af-f2c7-46eb-aac4-fe69996caf27" containerID="2bfb6eb0197869370269e475b7e819de3708dd20b2a400d6accc4988b6fa951b" exitCode=0 Mar 10 09:32:58 crc kubenswrapper[4883]: I0310 09:32:58.419186 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" 
event={"ID":"07ddb6af-f2c7-46eb-aac4-fe69996caf27","Type":"ContainerDied","Data":"2bfb6eb0197869370269e475b7e819de3708dd20b2a400d6accc4988b6fa951b"} Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.764387 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.823095 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dmqb\" (UniqueName: \"kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb\") pod \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.823591 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam\") pod \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.823745 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory\") pod \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\" (UID: \"07ddb6af-f2c7-46eb-aac4-fe69996caf27\") " Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.828854 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb" (OuterVolumeSpecName: "kube-api-access-8dmqb") pod "07ddb6af-f2c7-46eb-aac4-fe69996caf27" (UID: "07ddb6af-f2c7-46eb-aac4-fe69996caf27"). InnerVolumeSpecName "kube-api-access-8dmqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.856690 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory" (OuterVolumeSpecName: "inventory") pod "07ddb6af-f2c7-46eb-aac4-fe69996caf27" (UID: "07ddb6af-f2c7-46eb-aac4-fe69996caf27"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.859023 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "07ddb6af-f2c7-46eb-aac4-fe69996caf27" (UID: "07ddb6af-f2c7-46eb-aac4-fe69996caf27"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.925949 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dmqb\" (UniqueName: \"kubernetes.io/projected/07ddb6af-f2c7-46eb-aac4-fe69996caf27-kube-api-access-8dmqb\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.925991 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:59 crc kubenswrapper[4883]: I0310 09:32:59.926005 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/07ddb6af-f2c7-46eb-aac4-fe69996caf27-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.439773 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" 
event={"ID":"07ddb6af-f2c7-46eb-aac4-fe69996caf27","Type":"ContainerDied","Data":"1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829"} Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.439833 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a759c64eb7a3ebc65aa948dfaa6cb205a6cdae4383067868312470bfcf3e829" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.439844 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.512596 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"] Mar 10 09:33:00 crc kubenswrapper[4883]: E0310 09:33:00.513129 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc74aa89-09d6-4974-a6c1-1642f6ef0a64" containerName="oc" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.513158 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc74aa89-09d6-4974-a6c1-1642f6ef0a64" containerName="oc" Mar 10 09:33:00 crc kubenswrapper[4883]: E0310 09:33:00.513174 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ddb6af-f2c7-46eb-aac4-fe69996caf27" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.513183 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ddb6af-f2c7-46eb-aac4-fe69996caf27" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.513375 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ddb6af-f2c7-46eb-aac4-fe69996caf27" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.513409 4883 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fc74aa89-09d6-4974-a6c1-1642f6ef0a64" containerName="oc" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.514116 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.515861 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.519402 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.519594 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.519687 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.521540 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"] Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.638996 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrnx2\" (UniqueName: \"kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.639059 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.639356 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.741243 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrnx2\" (UniqueName: \"kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.741703 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.741847 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.748085 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.748395 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.757693 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrnx2\" (UniqueName: \"kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:00 crc kubenswrapper[4883]: I0310 09:33:00.828233 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:01 crc kubenswrapper[4883]: I0310 09:33:01.293394 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp"] Mar 10 09:33:01 crc kubenswrapper[4883]: I0310 09:33:01.448078 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" event={"ID":"20e06399-dd26-4a60-a6b7-261cc4505a92","Type":"ContainerStarted","Data":"8b05e77e3c54913dfd6f3cc0d546e79ac9a8a73933f4f22991d74e31f203e70c"} Mar 10 09:33:02 crc kubenswrapper[4883]: I0310 09:33:02.475357 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" event={"ID":"20e06399-dd26-4a60-a6b7-261cc4505a92","Type":"ContainerStarted","Data":"27e15d8be705ee66290bb5461ef69b684491fbcd39e4ef8202689d4194ff9079"} Mar 10 09:33:02 crc kubenswrapper[4883]: I0310 09:33:02.495704 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" podStartSLOduration=1.686731567 podStartE2EDuration="2.495683475s" podCreationTimestamp="2026-03-10 09:33:00 +0000 UTC" firstStartedPulling="2026-03-10 09:33:01.296336013 +0000 UTC m=+1767.551233902" lastFinishedPulling="2026-03-10 09:33:02.10528792 +0000 UTC m=+1768.360185810" observedRunningTime="2026-03-10 09:33:02.490818381 +0000 UTC m=+1768.745716270" watchObservedRunningTime="2026-03-10 09:33:02.495683475 +0000 UTC m=+1768.750581364" Mar 10 09:33:03 crc kubenswrapper[4883]: I0310 09:33:03.025941 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-jt8bs"] Mar 10 09:33:03 crc kubenswrapper[4883]: I0310 09:33:03.032165 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-jt8bs"] Mar 10 09:33:04 crc kubenswrapper[4883]: 
I0310 09:33:04.091790 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d78560f-1b01-4ac1-9c36-109595422d78" path="/var/lib/kubelet/pods/6d78560f-1b01-4ac1-9c36-109595422d78/volumes" Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.476632 4883 scope.go:117] "RemoveContainer" containerID="655c18b53a1ce432bb921c39b01b2479f6ad37de70ab46a32b33f42c98fb125b" Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.506521 4883 scope.go:117] "RemoveContainer" containerID="9f05adebe53489f83df9e03cf5da9583790650f545f7218c9e2d571583c52501" Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.536134 4883 scope.go:117] "RemoveContainer" containerID="a48c527e869a78aa5301ce2ab9632963d3e2d800250d247df83963b7da9be724" Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.570073 4883 scope.go:117] "RemoveContainer" containerID="affbbf9bc93bb1cdc534fd16ed32d4696b867f4c70c0f6fa49bc5b18c4e55f72" Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.593730 4883 scope.go:117] "RemoveContainer" containerID="c7587acba5dab37b49dbdd81924e01184e73978fd599f62b1af6671e7ae50b6e" Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.620773 4883 scope.go:117] "RemoveContainer" containerID="a4591a3c1d7b1279937a98c799c83abe99429a53d01680d0592b50d1d359cd42" Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.654442 4883 scope.go:117] "RemoveContainer" containerID="50e03d7f0d395af101a158c665d03ed09d9fb01ebb9c16be32b6ee2d458d2bcb" Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.702795 4883 scope.go:117] "RemoveContainer" containerID="e71d6d0bfc1a16e1dac8ed8107010142581b341f08eea68db466ac02a741d979" Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.738905 4883 scope.go:117] "RemoveContainer" containerID="8b81faa071a739cf8a7f25085f6d2124f3dcc3e17601b69b578e8e6f428069ce" Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.758180 4883 scope.go:117] "RemoveContainer" containerID="98460e32a504c0e3ede8a9fd544c2c34e4954a1cfed507bb532c53cf560762fd" 
Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.775660 4883 scope.go:117] "RemoveContainer" containerID="f847a3856c40dae8ee7ad4ac93ecaf18ffea7cc99907753e085972a5a353218f" Mar 10 09:33:04 crc kubenswrapper[4883]: I0310 09:33:04.807738 4883 scope.go:117] "RemoveContainer" containerID="5dba08c9d93be005c0c85060006f6110a86c429508b6e36e94151d58e533d961" Mar 10 09:33:06 crc kubenswrapper[4883]: I0310 09:33:06.516929 4883 generic.go:334] "Generic (PLEG): container finished" podID="20e06399-dd26-4a60-a6b7-261cc4505a92" containerID="27e15d8be705ee66290bb5461ef69b684491fbcd39e4ef8202689d4194ff9079" exitCode=0 Mar 10 09:33:06 crc kubenswrapper[4883]: I0310 09:33:06.516977 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" event={"ID":"20e06399-dd26-4a60-a6b7-261cc4505a92","Type":"ContainerDied","Data":"27e15d8be705ee66290bb5461ef69b684491fbcd39e4ef8202689d4194ff9079"} Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.878273 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.897825 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory\") pod \"20e06399-dd26-4a60-a6b7-261cc4505a92\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.898852 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrnx2\" (UniqueName: \"kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2\") pod \"20e06399-dd26-4a60-a6b7-261cc4505a92\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.898894 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam\") pod \"20e06399-dd26-4a60-a6b7-261cc4505a92\" (UID: \"20e06399-dd26-4a60-a6b7-261cc4505a92\") " Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.924467 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2" (OuterVolumeSpecName: "kube-api-access-zrnx2") pod "20e06399-dd26-4a60-a6b7-261cc4505a92" (UID: "20e06399-dd26-4a60-a6b7-261cc4505a92"). InnerVolumeSpecName "kube-api-access-zrnx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:33:07 crc kubenswrapper[4883]: I0310 09:33:07.978595 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "20e06399-dd26-4a60-a6b7-261cc4505a92" (UID: "20e06399-dd26-4a60-a6b7-261cc4505a92"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.003774 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrnx2\" (UniqueName: \"kubernetes.io/projected/20e06399-dd26-4a60-a6b7-261cc4505a92-kube-api-access-zrnx2\") on node \"crc\" DevicePath \"\"" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.003813 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.038638 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory" (OuterVolumeSpecName: "inventory") pod "20e06399-dd26-4a60-a6b7-261cc4505a92" (UID: "20e06399-dd26-4a60-a6b7-261cc4505a92"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.106698 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20e06399-dd26-4a60-a6b7-261cc4505a92-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.538451 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" event={"ID":"20e06399-dd26-4a60-a6b7-261cc4505a92","Type":"ContainerDied","Data":"8b05e77e3c54913dfd6f3cc0d546e79ac9a8a73933f4f22991d74e31f203e70c"} Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.538538 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b05e77e3c54913dfd6f3cc0d546e79ac9a8a73933f4f22991d74e31f203e70c" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.538552 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.589067 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"] Mar 10 09:33:08 crc kubenswrapper[4883]: E0310 09:33:08.589500 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e06399-dd26-4a60-a6b7-261cc4505a92" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.589521 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e06399-dd26-4a60-a6b7-261cc4505a92" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.589697 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e06399-dd26-4a60-a6b7-261cc4505a92" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:08 
crc kubenswrapper[4883]: I0310 09:33:08.590281 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.592442 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.592466 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.592791 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.593119 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.597343 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"] Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.618250 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.618388 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.618534 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn8s4\" (UniqueName: \"kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.720747 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.720823 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn8s4\" (UniqueName: \"kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.720907 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.725315 4883 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.725433 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.735576 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn8s4\" (UniqueName: \"kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kglh5\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:08 crc kubenswrapper[4883]: I0310 09:33:08.903574 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:09 crc kubenswrapper[4883]: I0310 09:33:09.036181 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-n7f74"] Mar 10 09:33:09 crc kubenswrapper[4883]: I0310 09:33:09.043905 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-n7f74"] Mar 10 09:33:09 crc kubenswrapper[4883]: I0310 09:33:09.378143 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5"] Mar 10 09:33:09 crc kubenswrapper[4883]: I0310 09:33:09.550572 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" event={"ID":"361b2613-f26e-45c3-aabe-9a0f115e8e10","Type":"ContainerStarted","Data":"5c356be1c10709502948cd7744788e12bf33b0f2417e3251b6ef4163577344fe"} Mar 10 09:33:10 crc kubenswrapper[4883]: I0310 09:33:10.089769 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5acdd73a-9879-4507-8f6d-10e2ad8065e4" path="/var/lib/kubelet/pods/5acdd73a-9879-4507-8f6d-10e2ad8065e4/volumes" Mar 10 09:33:10 crc kubenswrapper[4883]: I0310 09:33:10.561455 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" event={"ID":"361b2613-f26e-45c3-aabe-9a0f115e8e10","Type":"ContainerStarted","Data":"1e093362aedc93f97368ec66a4b731148badab0e6dd22037fa34966ee5d3c592"} Mar 10 09:33:10 crc kubenswrapper[4883]: I0310 09:33:10.579155 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" podStartSLOduration=2.068709002 podStartE2EDuration="2.57912773s" podCreationTimestamp="2026-03-10 09:33:08 +0000 UTC" firstStartedPulling="2026-03-10 09:33:09.384962892 +0000 UTC m=+1775.639860770" lastFinishedPulling="2026-03-10 09:33:09.895381609 +0000 UTC 
m=+1776.150279498" observedRunningTime="2026-03-10 09:33:10.573359844 +0000 UTC m=+1776.828257734" watchObservedRunningTime="2026-03-10 09:33:10.57912773 +0000 UTC m=+1776.834025619" Mar 10 09:33:11 crc kubenswrapper[4883]: I0310 09:33:11.079858 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:33:11 crc kubenswrapper[4883]: E0310 09:33:11.080358 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:33:19 crc kubenswrapper[4883]: I0310 09:33:19.051107 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-v9wqz"] Mar 10 09:33:19 crc kubenswrapper[4883]: I0310 09:33:19.061573 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-v9wqz"] Mar 10 09:33:20 crc kubenswrapper[4883]: I0310 09:33:20.090851 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78bfcd03-74e4-4238-ae81-043bc04105cd" path="/var/lib/kubelet/pods/78bfcd03-74e4-4238-ae81-043bc04105cd/volumes" Mar 10 09:33:24 crc kubenswrapper[4883]: I0310 09:33:24.037703 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-x2hf5"] Mar 10 09:33:24 crc kubenswrapper[4883]: I0310 09:33:24.044064 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-x2hf5"] Mar 10 09:33:24 crc kubenswrapper[4883]: I0310 09:33:24.084963 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:33:24 crc kubenswrapper[4883]: E0310 09:33:24.085290 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:33:24 crc kubenswrapper[4883]: I0310 09:33:24.088936 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc0b1d9d-7834-473a-a487-6f540c606706" path="/var/lib/kubelet/pods/dc0b1d9d-7834-473a-a487-6f540c606706/volumes" Mar 10 09:33:37 crc kubenswrapper[4883]: I0310 09:33:37.080185 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:33:37 crc kubenswrapper[4883]: E0310 09:33:37.080946 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:33:38 crc kubenswrapper[4883]: I0310 09:33:38.809489 4883 generic.go:334] "Generic (PLEG): container finished" podID="361b2613-f26e-45c3-aabe-9a0f115e8e10" containerID="1e093362aedc93f97368ec66a4b731148badab0e6dd22037fa34966ee5d3c592" exitCode=0 Mar 10 09:33:38 crc kubenswrapper[4883]: I0310 09:33:38.809572 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" event={"ID":"361b2613-f26e-45c3-aabe-9a0f115e8e10","Type":"ContainerDied","Data":"1e093362aedc93f97368ec66a4b731148badab0e6dd22037fa34966ee5d3c592"} Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 
09:33:40.279959 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.383809 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory\") pod \"361b2613-f26e-45c3-aabe-9a0f115e8e10\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.384014 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam\") pod \"361b2613-f26e-45c3-aabe-9a0f115e8e10\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.384141 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn8s4\" (UniqueName: \"kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4\") pod \"361b2613-f26e-45c3-aabe-9a0f115e8e10\" (UID: \"361b2613-f26e-45c3-aabe-9a0f115e8e10\") " Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.388984 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4" (OuterVolumeSpecName: "kube-api-access-nn8s4") pod "361b2613-f26e-45c3-aabe-9a0f115e8e10" (UID: "361b2613-f26e-45c3-aabe-9a0f115e8e10"). InnerVolumeSpecName "kube-api-access-nn8s4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.409043 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory" (OuterVolumeSpecName: "inventory") pod "361b2613-f26e-45c3-aabe-9a0f115e8e10" (UID: "361b2613-f26e-45c3-aabe-9a0f115e8e10"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.410788 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "361b2613-f26e-45c3-aabe-9a0f115e8e10" (UID: "361b2613-f26e-45c3-aabe-9a0f115e8e10"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.485878 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.485907 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361b2613-f26e-45c3-aabe-9a0f115e8e10-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.485919 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn8s4\" (UniqueName: \"kubernetes.io/projected/361b2613-f26e-45c3-aabe-9a0f115e8e10-kube-api-access-nn8s4\") on node \"crc\" DevicePath \"\"" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.829361 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" 
event={"ID":"361b2613-f26e-45c3-aabe-9a0f115e8e10","Type":"ContainerDied","Data":"5c356be1c10709502948cd7744788e12bf33b0f2417e3251b6ef4163577344fe"} Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.829435 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c356be1c10709502948cd7744788e12bf33b0f2417e3251b6ef4163577344fe" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.829434 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kglh5" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.896010 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh"] Mar 10 09:33:40 crc kubenswrapper[4883]: E0310 09:33:40.896875 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361b2613-f26e-45c3-aabe-9a0f115e8e10" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.896898 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="361b2613-f26e-45c3-aabe-9a0f115e8e10" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.897125 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="361b2613-f26e-45c3-aabe-9a0f115e8e10" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.897879 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.900094 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.900519 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.900657 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.901440 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.904872 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh"] Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.996986 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djdnt\" (UniqueName: \"kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:40 crc kubenswrapper[4883]: I0310 09:33:40.997070 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:40 crc 
kubenswrapper[4883]: I0310 09:33:40.997158 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.100237 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.101110 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djdnt\" (UniqueName: \"kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.101559 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.105232 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.106052 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.118021 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djdnt\" (UniqueName: \"kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.212912 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.676960 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh"] Mar 10 09:33:41 crc kubenswrapper[4883]: I0310 09:33:41.838746 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" event={"ID":"269dd9c8-3d75-4892-9f75-c4fe1b9093b8","Type":"ContainerStarted","Data":"8303a8bf083bd2d991e220d511db2004161e87ee602ae2da6f0bac8a22dc1f07"} Mar 10 09:33:42 crc kubenswrapper[4883]: I0310 09:33:42.856939 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" event={"ID":"269dd9c8-3d75-4892-9f75-c4fe1b9093b8","Type":"ContainerStarted","Data":"7062c75f5b57df7b7f1570cd08c40c2a3f71aa8c69bb0f4ca90d1b87f910e784"} Mar 10 09:33:42 crc kubenswrapper[4883]: I0310 09:33:42.879329 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" podStartSLOduration=2.127707916 podStartE2EDuration="2.879312129s" podCreationTimestamp="2026-03-10 09:33:40 +0000 UTC" firstStartedPulling="2026-03-10 09:33:41.681365831 +0000 UTC m=+1807.936263710" lastFinishedPulling="2026-03-10 09:33:42.432970033 +0000 UTC m=+1808.687867923" observedRunningTime="2026-03-10 09:33:42.86928366 +0000 UTC m=+1809.124181549" watchObservedRunningTime="2026-03-10 09:33:42.879312129 +0000 UTC m=+1809.134210008" Mar 10 09:33:48 crc kubenswrapper[4883]: I0310 09:33:48.080565 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:33:48 crc kubenswrapper[4883]: E0310 09:33:48.081700 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:33:59 crc kubenswrapper[4883]: I0310 09:33:59.034548 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-l9ldx"] Mar 10 09:33:59 crc kubenswrapper[4883]: I0310 09:33:59.039104 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-l9ldx"] Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.022559 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f74b-account-create-update-lsxls"] Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.030838 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f74b-account-create-update-lsxls"] Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.091680 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d355ddcd-9120-4436-84c4-928027e6ee33" path="/var/lib/kubelet/pods/d355ddcd-9120-4436-84c4-928027e6ee33/volumes" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.092298 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e694b4cb-0aa6-46d5-b6be-039d6a92e4a8" path="/var/lib/kubelet/pods/e694b4cb-0aa6-46d5-b6be-039d6a92e4a8/volumes" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.144317 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552254-d9q7p"] Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.146174 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.148778 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.148835 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.152711 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.154628 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552254-d9q7p"] Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.321984 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lds9l\" (UniqueName: \"kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l\") pod \"auto-csr-approver-29552254-d9q7p\" (UID: \"b477b90a-75af-4621-8c33-21fdd8c9c749\") " pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.425126 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lds9l\" (UniqueName: \"kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l\") pod \"auto-csr-approver-29552254-d9q7p\" (UID: \"b477b90a-75af-4621-8c33-21fdd8c9c749\") " pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.444711 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lds9l\" (UniqueName: \"kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l\") pod \"auto-csr-approver-29552254-d9q7p\" (UID: \"b477b90a-75af-4621-8c33-21fdd8c9c749\") " 
pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.470812 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:00 crc kubenswrapper[4883]: I0310 09:34:00.878712 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552254-d9q7p"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.012990 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" event={"ID":"b477b90a-75af-4621-8c33-21fdd8c9c749","Type":"ContainerStarted","Data":"9c061b891dab6d5107f249b527e3bc76699e556e53745a31ae035535d96ba2dc"} Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.045427 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-zr486"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.057543 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c052-account-create-update-hg4pd"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.068062 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4vxd6"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.076972 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-15d6-account-create-update-lwjcj"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.096064 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-zr486"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.106982 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-15d6-account-create-update-lwjcj"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.113983 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4vxd6"] Mar 10 09:34:01 crc kubenswrapper[4883]: I0310 09:34:01.119112 4883 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c052-account-create-update-hg4pd"] Mar 10 09:34:02 crc kubenswrapper[4883]: I0310 09:34:02.089725 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e39def71-60ef-4b2a-823b-1c5e89e02647" path="/var/lib/kubelet/pods/e39def71-60ef-4b2a-823b-1c5e89e02647/volumes" Mar 10 09:34:02 crc kubenswrapper[4883]: I0310 09:34:02.090577 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9dd286b-6aa5-4525-a645-8e4ec79af348" path="/var/lib/kubelet/pods/e9dd286b-6aa5-4525-a645-8e4ec79af348/volumes" Mar 10 09:34:02 crc kubenswrapper[4883]: I0310 09:34:02.091043 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9eca221-08d6-4b22-8d5c-1cd9e95c65d9" path="/var/lib/kubelet/pods/f9eca221-08d6-4b22-8d5c-1cd9e95c65d9/volumes" Mar 10 09:34:02 crc kubenswrapper[4883]: I0310 09:34:02.091532 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdbd0859-6f93-4118-9e5b-2170ec3d43ad" path="/var/lib/kubelet/pods/fdbd0859-6f93-4118-9e5b-2170ec3d43ad/volumes" Mar 10 09:34:03 crc kubenswrapper[4883]: I0310 09:34:03.030942 4883 generic.go:334] "Generic (PLEG): container finished" podID="b477b90a-75af-4621-8c33-21fdd8c9c749" containerID="6864877f7abf0513eaa87f372fd6fb5c7baab57f240f7c0cd19def879aaf0dc8" exitCode=0 Mar 10 09:34:03 crc kubenswrapper[4883]: I0310 09:34:03.030993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" event={"ID":"b477b90a-75af-4621-8c33-21fdd8c9c749","Type":"ContainerDied","Data":"6864877f7abf0513eaa87f372fd6fb5c7baab57f240f7c0cd19def879aaf0dc8"} Mar 10 09:34:03 crc kubenswrapper[4883]: I0310 09:34:03.079541 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:34:03 crc kubenswrapper[4883]: E0310 09:34:03.079908 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:34:04 crc kubenswrapper[4883]: I0310 09:34:04.327023 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:04 crc kubenswrapper[4883]: I0310 09:34:04.403295 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lds9l\" (UniqueName: \"kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l\") pod \"b477b90a-75af-4621-8c33-21fdd8c9c749\" (UID: \"b477b90a-75af-4621-8c33-21fdd8c9c749\") " Mar 10 09:34:04 crc kubenswrapper[4883]: I0310 09:34:04.409653 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l" (OuterVolumeSpecName: "kube-api-access-lds9l") pod "b477b90a-75af-4621-8c33-21fdd8c9c749" (UID: "b477b90a-75af-4621-8c33-21fdd8c9c749"). InnerVolumeSpecName "kube-api-access-lds9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:34:04 crc kubenswrapper[4883]: I0310 09:34:04.506337 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lds9l\" (UniqueName: \"kubernetes.io/projected/b477b90a-75af-4621-8c33-21fdd8c9c749-kube-api-access-lds9l\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.003591 4883 scope.go:117] "RemoveContainer" containerID="d23bd9bd934e3d99aa1bfc6ccd981d2c32f179368b140bc3665f8538d2c19637" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.026742 4883 scope.go:117] "RemoveContainer" containerID="447fb8e21bbb8bc037b2913c14d7e94664156b41e117d80c9e5a7c41f589d745" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.050838 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" event={"ID":"b477b90a-75af-4621-8c33-21fdd8c9c749","Type":"ContainerDied","Data":"9c061b891dab6d5107f249b527e3bc76699e556e53745a31ae035535d96ba2dc"} Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.050887 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c061b891dab6d5107f249b527e3bc76699e556e53745a31ae035535d96ba2dc" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.051214 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-d9q7p" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.070993 4883 scope.go:117] "RemoveContainer" containerID="edddf942ff54cf02d31c8d37d1a93a850752455b76c3f9b8d5acabfd5e985820" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.093295 4883 scope.go:117] "RemoveContainer" containerID="e1764b4c527a37975cc4e8ef8e724e3a447bf2698dc21042ec02c05dc2aa830c" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.127345 4883 scope.go:117] "RemoveContainer" containerID="d00a8e52c275926dd98e27f733cab6f0434bb178955532902b15a5e7ab5d08f8" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.146920 4883 scope.go:117] "RemoveContainer" containerID="911a3a55df750ade4baf157ba45ca477442b34e3cdb85095ba2fd8c7ca80b8fe" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.164297 4883 scope.go:117] "RemoveContainer" containerID="0058e7bce226cca970138ded52e080821de3de3f9381dfb13e4eee1dea226cbf" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.195665 4883 scope.go:117] "RemoveContainer" containerID="607df7cc400848b9303988905dd099e3a6ed13423fde585243a8fdb269315664" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.232026 4883 scope.go:117] "RemoveContainer" containerID="ca65d438bf33bf28ae2760bd85500809892b5530a19b9249afc6d3b1e6b1e723" Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.388634 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-gvmzc"] Mar 10 09:34:05 crc kubenswrapper[4883]: I0310 09:34:05.396232 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-gvmzc"] Mar 10 09:34:06 crc kubenswrapper[4883]: I0310 09:34:06.089339 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce" path="/var/lib/kubelet/pods/804b1fd1-8bcc-4d7b-9ad7-0c8b5fbe49ce/volumes" Mar 10 09:34:15 crc kubenswrapper[4883]: I0310 09:34:15.080443 4883 
scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:34:15 crc kubenswrapper[4883]: E0310 09:34:15.081385 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:34:19 crc kubenswrapper[4883]: I0310 09:34:19.179511 4883 generic.go:334] "Generic (PLEG): container finished" podID="269dd9c8-3d75-4892-9f75-c4fe1b9093b8" containerID="7062c75f5b57df7b7f1570cd08c40c2a3f71aa8c69bb0f4ca90d1b87f910e784" exitCode=0 Mar 10 09:34:19 crc kubenswrapper[4883]: I0310 09:34:19.179588 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" event={"ID":"269dd9c8-3d75-4892-9f75-c4fe1b9093b8","Type":"ContainerDied","Data":"7062c75f5b57df7b7f1570cd08c40c2a3f71aa8c69bb0f4ca90d1b87f910e784"} Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.542844 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.727087 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djdnt\" (UniqueName: \"kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt\") pod \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.727341 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory\") pod \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.727638 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam\") pod \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\" (UID: \"269dd9c8-3d75-4892-9f75-c4fe1b9093b8\") " Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.733625 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt" (OuterVolumeSpecName: "kube-api-access-djdnt") pod "269dd9c8-3d75-4892-9f75-c4fe1b9093b8" (UID: "269dd9c8-3d75-4892-9f75-c4fe1b9093b8"). InnerVolumeSpecName "kube-api-access-djdnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.753834 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory" (OuterVolumeSpecName: "inventory") pod "269dd9c8-3d75-4892-9f75-c4fe1b9093b8" (UID: "269dd9c8-3d75-4892-9f75-c4fe1b9093b8"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.754435 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "269dd9c8-3d75-4892-9f75-c4fe1b9093b8" (UID: "269dd9c8-3d75-4892-9f75-c4fe1b9093b8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.831154 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djdnt\" (UniqueName: \"kubernetes.io/projected/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-kube-api-access-djdnt\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.831194 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:20 crc kubenswrapper[4883]: I0310 09:34:20.831210 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/269dd9c8-3d75-4892-9f75-c4fe1b9093b8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.209799 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" event={"ID":"269dd9c8-3d75-4892-9f75-c4fe1b9093b8","Type":"ContainerDied","Data":"8303a8bf083bd2d991e220d511db2004161e87ee602ae2da6f0bac8a22dc1f07"} Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.211811 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8303a8bf083bd2d991e220d511db2004161e87ee602ae2da6f0bac8a22dc1f07" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 
09:34:21.211957 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.268709 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v5v84"] Mar 10 09:34:21 crc kubenswrapper[4883]: E0310 09:34:21.269238 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269dd9c8-3d75-4892-9f75-c4fe1b9093b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.269260 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="269dd9c8-3d75-4892-9f75-c4fe1b9093b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:21 crc kubenswrapper[4883]: E0310 09:34:21.269319 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b477b90a-75af-4621-8c33-21fdd8c9c749" containerName="oc" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.269327 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="b477b90a-75af-4621-8c33-21fdd8c9c749" containerName="oc" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.269590 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="269dd9c8-3d75-4892-9f75-c4fe1b9093b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.269621 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="b477b90a-75af-4621-8c33-21fdd8c9c749" containerName="oc" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.270523 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.272975 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.273148 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.273175 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.273176 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.278152 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v5v84"] Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.344638 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnxh\" (UniqueName: \"kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.344712 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.344876 4883 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.447294 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.447490 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.447639 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnxh\" (UniqueName: \"kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.452378 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc 
kubenswrapper[4883]: I0310 09:34:21.453936 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.464320 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhnxh\" (UniqueName: \"kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh\") pod \"ssh-known-hosts-edpm-deployment-v5v84\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:21 crc kubenswrapper[4883]: I0310 09:34:21.584083 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:22 crc kubenswrapper[4883]: I0310 09:34:22.050970 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-v5v84"] Mar 10 09:34:22 crc kubenswrapper[4883]: I0310 09:34:22.219177 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" event={"ID":"caa69332-97ab-4629-900f-1596af363ba4","Type":"ContainerStarted","Data":"aa00dd959f63d750c79bf5b931b12bc09736bfd10221fd826891011af80ca166"} Mar 10 09:34:23 crc kubenswrapper[4883]: I0310 09:34:23.042998 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pghj7"] Mar 10 09:34:23 crc kubenswrapper[4883]: I0310 09:34:23.052998 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pghj7"] Mar 10 09:34:23 crc kubenswrapper[4883]: I0310 09:34:23.231747 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" event={"ID":"caa69332-97ab-4629-900f-1596af363ba4","Type":"ContainerStarted","Data":"fb01332e612ee8643c9cd5071e07447d36df50055a15b4afd7c545e1c1d03333"} Mar 10 09:34:23 crc kubenswrapper[4883]: I0310 09:34:23.256943 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" podStartSLOduration=1.626030222 podStartE2EDuration="2.256921721s" podCreationTimestamp="2026-03-10 09:34:21 +0000 UTC" firstStartedPulling="2026-03-10 09:34:22.054412209 +0000 UTC m=+1848.309310098" lastFinishedPulling="2026-03-10 09:34:22.685303708 +0000 UTC m=+1848.940201597" observedRunningTime="2026-03-10 09:34:23.247361505 +0000 UTC m=+1849.502259393" watchObservedRunningTime="2026-03-10 09:34:23.256921721 +0000 UTC m=+1849.511819610" Mar 10 09:34:24 crc kubenswrapper[4883]: I0310 09:34:24.089741 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46c8e962-9007-49e1-bd9f-d822e9100291" path="/var/lib/kubelet/pods/46c8e962-9007-49e1-bd9f-d822e9100291/volumes" Mar 10 09:34:28 crc kubenswrapper[4883]: I0310 09:34:28.274365 4883 generic.go:334] "Generic (PLEG): container finished" podID="caa69332-97ab-4629-900f-1596af363ba4" containerID="fb01332e612ee8643c9cd5071e07447d36df50055a15b4afd7c545e1c1d03333" exitCode=0 Mar 10 09:34:28 crc kubenswrapper[4883]: I0310 09:34:28.274462 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" event={"ID":"caa69332-97ab-4629-900f-1596af363ba4","Type":"ContainerDied","Data":"fb01332e612ee8643c9cd5071e07447d36df50055a15b4afd7c545e1c1d03333"} Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.617873 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.805453 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam\") pod \"caa69332-97ab-4629-900f-1596af363ba4\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.805760 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhnxh\" (UniqueName: \"kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh\") pod \"caa69332-97ab-4629-900f-1596af363ba4\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.805855 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0\") pod \"caa69332-97ab-4629-900f-1596af363ba4\" (UID: \"caa69332-97ab-4629-900f-1596af363ba4\") " Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.811775 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh" (OuterVolumeSpecName: "kube-api-access-nhnxh") pod "caa69332-97ab-4629-900f-1596af363ba4" (UID: "caa69332-97ab-4629-900f-1596af363ba4"). InnerVolumeSpecName "kube-api-access-nhnxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.829675 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "caa69332-97ab-4629-900f-1596af363ba4" (UID: "caa69332-97ab-4629-900f-1596af363ba4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.831866 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "caa69332-97ab-4629-900f-1596af363ba4" (UID: "caa69332-97ab-4629-900f-1596af363ba4"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.908957 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.908990 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhnxh\" (UniqueName: \"kubernetes.io/projected/caa69332-97ab-4629-900f-1596af363ba4-kube-api-access-nhnxh\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:29 crc kubenswrapper[4883]: I0310 09:34:29.909001 4883 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/caa69332-97ab-4629-900f-1596af363ba4-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.079869 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:34:30 crc 
kubenswrapper[4883]: E0310 09:34:30.080331 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.292949 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" event={"ID":"caa69332-97ab-4629-900f-1596af363ba4","Type":"ContainerDied","Data":"aa00dd959f63d750c79bf5b931b12bc09736bfd10221fd826891011af80ca166"} Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.293003 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa00dd959f63d750c79bf5b931b12bc09736bfd10221fd826891011af80ca166" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.293005 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-v5v84" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.358659 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc"] Mar 10 09:34:30 crc kubenswrapper[4883]: E0310 09:34:30.359058 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa69332-97ab-4629-900f-1596af363ba4" containerName="ssh-known-hosts-edpm-deployment" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.359080 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa69332-97ab-4629-900f-1596af363ba4" containerName="ssh-known-hosts-edpm-deployment" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.359264 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa69332-97ab-4629-900f-1596af363ba4" containerName="ssh-known-hosts-edpm-deployment" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.359895 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.362319 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.362547 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.362736 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.363684 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.373981 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc"] Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.418257 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swnx4\" (UniqueName: \"kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.418348 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.418570 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.519820 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.519892 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swnx4\" (UniqueName: \"kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.519944 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.525161 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: 
\"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.525964 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.536866 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swnx4\" (UniqueName: \"kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rlqjc\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:30 crc kubenswrapper[4883]: I0310 09:34:30.673955 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:31 crc kubenswrapper[4883]: I0310 09:34:31.133546 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc"] Mar 10 09:34:31 crc kubenswrapper[4883]: I0310 09:34:31.301417 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" event={"ID":"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b","Type":"ContainerStarted","Data":"d5f0e6ed073a3d1de2b6b73203aecce347baf4d6a677d985db4e33c19ce15970"} Mar 10 09:34:32 crc kubenswrapper[4883]: I0310 09:34:32.310941 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" event={"ID":"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b","Type":"ContainerStarted","Data":"4c819d4de3b0b257ea9ca3015be282299cdfba8979a472bd1fbf9b9d67be7933"} Mar 10 09:34:32 crc kubenswrapper[4883]: I0310 09:34:32.341741 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" podStartSLOduration=1.846960975 podStartE2EDuration="2.341715335s" podCreationTimestamp="2026-03-10 09:34:30 +0000 UTC" firstStartedPulling="2026-03-10 09:34:31.14128878 +0000 UTC m=+1857.396186669" lastFinishedPulling="2026-03-10 09:34:31.63604314 +0000 UTC m=+1857.890941029" observedRunningTime="2026-03-10 09:34:32.323316562 +0000 UTC m=+1858.578214451" watchObservedRunningTime="2026-03-10 09:34:32.341715335 +0000 UTC m=+1858.596613224" Mar 10 09:34:38 crc kubenswrapper[4883]: I0310 09:34:38.024621 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s66f5"] Mar 10 09:34:38 crc kubenswrapper[4883]: I0310 09:34:38.030284 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s66f5"] Mar 10 09:34:38 crc kubenswrapper[4883]: I0310 09:34:38.087443 4883 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef281d66-a4a1-4a1a-b9d7-d6265b46ca05" path="/var/lib/kubelet/pods/ef281d66-a4a1-4a1a-b9d7-d6265b46ca05/volumes" Mar 10 09:34:38 crc kubenswrapper[4883]: I0310 09:34:38.361208 4883 generic.go:334] "Generic (PLEG): container finished" podID="61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" containerID="4c819d4de3b0b257ea9ca3015be282299cdfba8979a472bd1fbf9b9d67be7933" exitCode=0 Mar 10 09:34:38 crc kubenswrapper[4883]: I0310 09:34:38.361260 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" event={"ID":"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b","Type":"ContainerDied","Data":"4c819d4de3b0b257ea9ca3015be282299cdfba8979a472bd1fbf9b9d67be7933"} Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.700879 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.899260 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam\") pod \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.899391 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swnx4\" (UniqueName: \"kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4\") pod \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.899508 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory\") pod 
\"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\" (UID: \"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b\") " Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.905119 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4" (OuterVolumeSpecName: "kube-api-access-swnx4") pod "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" (UID: "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b"). InnerVolumeSpecName "kube-api-access-swnx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.921095 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" (UID: "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:39 crc kubenswrapper[4883]: I0310 09:34:39.921570 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory" (OuterVolumeSpecName: "inventory") pod "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" (UID: "61bb4cc5-1d4f-4439-a00e-4b2e27d4802b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.002066 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.002098 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swnx4\" (UniqueName: \"kubernetes.io/projected/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-kube-api-access-swnx4\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.002109 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61bb4cc5-1d4f-4439-a00e-4b2e27d4802b-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.024043 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-rptkb"] Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.029895 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-rptkb"] Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.089388 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3a7934-1ab2-4013-b3ff-90859ffcc179" path="/var/lib/kubelet/pods/3d3a7934-1ab2-4013-b3ff-90859ffcc179/volumes" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.377159 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" event={"ID":"61bb4cc5-1d4f-4439-a00e-4b2e27d4802b","Type":"ContainerDied","Data":"d5f0e6ed073a3d1de2b6b73203aecce347baf4d6a677d985db4e33c19ce15970"} Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.377212 4883 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d5f0e6ed073a3d1de2b6b73203aecce347baf4d6a677d985db4e33c19ce15970" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.377268 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rlqjc" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.432069 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"] Mar 10 09:34:40 crc kubenswrapper[4883]: E0310 09:34:40.432671 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.432691 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.432932 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="61bb4cc5-1d4f-4439-a00e-4b2e27d4802b" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.433730 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.437993 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.438106 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"] Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.438943 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.439170 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.439222 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.612384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.612597 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.612661 4883 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqjd7\" (UniqueName: \"kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.714054 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.714131 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqjd7\" (UniqueName: \"kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.714187 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.719164 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.719788 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.728950 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqjd7\" (UniqueName: \"kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:40 crc kubenswrapper[4883]: I0310 09:34:40.752988 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:41 crc kubenswrapper[4883]: I0310 09:34:41.080149 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:34:41 crc kubenswrapper[4883]: E0310 09:34:41.080422 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:34:41 crc kubenswrapper[4883]: I0310 09:34:41.208613 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz"] Mar 10 09:34:41 crc kubenswrapper[4883]: I0310 09:34:41.387995 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" event={"ID":"0efdf39d-2133-4aaf-9fec-2b50533d3cae","Type":"ContainerStarted","Data":"16f2e4b6a9b14352515d1b370e2f4f84e250f6963b28fb04190abdc0181c01d6"} Mar 10 09:34:42 crc kubenswrapper[4883]: I0310 09:34:42.396954 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" event={"ID":"0efdf39d-2133-4aaf-9fec-2b50533d3cae","Type":"ContainerStarted","Data":"73f1da7ddd7a009338a4315bdefbd88cf56f53350b85e5696a99a4463ab74afe"} Mar 10 09:34:42 crc kubenswrapper[4883]: I0310 09:34:42.414108 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" podStartSLOduration=1.9080569760000001 podStartE2EDuration="2.414088131s" podCreationTimestamp="2026-03-10 09:34:40 +0000 UTC" firstStartedPulling="2026-03-10 
09:34:41.213882132 +0000 UTC m=+1867.468780021" lastFinishedPulling="2026-03-10 09:34:41.719913287 +0000 UTC m=+1867.974811176" observedRunningTime="2026-03-10 09:34:42.412450655 +0000 UTC m=+1868.667348543" watchObservedRunningTime="2026-03-10 09:34:42.414088131 +0000 UTC m=+1868.668986021" Mar 10 09:34:49 crc kubenswrapper[4883]: I0310 09:34:49.454319 4883 generic.go:334] "Generic (PLEG): container finished" podID="0efdf39d-2133-4aaf-9fec-2b50533d3cae" containerID="73f1da7ddd7a009338a4315bdefbd88cf56f53350b85e5696a99a4463ab74afe" exitCode=0 Mar 10 09:34:49 crc kubenswrapper[4883]: I0310 09:34:49.454401 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" event={"ID":"0efdf39d-2133-4aaf-9fec-2b50533d3cae","Type":"ContainerDied","Data":"73f1da7ddd7a009338a4315bdefbd88cf56f53350b85e5696a99a4463ab74afe"} Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.761867 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.903676 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqjd7\" (UniqueName: \"kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7\") pod \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.903737 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam\") pod \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.903764 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory\") pod \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\" (UID: \"0efdf39d-2133-4aaf-9fec-2b50533d3cae\") " Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.910002 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7" (OuterVolumeSpecName: "kube-api-access-wqjd7") pod "0efdf39d-2133-4aaf-9fec-2b50533d3cae" (UID: "0efdf39d-2133-4aaf-9fec-2b50533d3cae"). InnerVolumeSpecName "kube-api-access-wqjd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.928740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory" (OuterVolumeSpecName: "inventory") pod "0efdf39d-2133-4aaf-9fec-2b50533d3cae" (UID: "0efdf39d-2133-4aaf-9fec-2b50533d3cae"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:50 crc kubenswrapper[4883]: I0310 09:34:50.929253 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0efdf39d-2133-4aaf-9fec-2b50533d3cae" (UID: "0efdf39d-2133-4aaf-9fec-2b50533d3cae"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.006156 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqjd7\" (UniqueName: \"kubernetes.io/projected/0efdf39d-2133-4aaf-9fec-2b50533d3cae-kube-api-access-wqjd7\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.006191 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.006204 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0efdf39d-2133-4aaf-9fec-2b50533d3cae-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.470757 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" event={"ID":"0efdf39d-2133-4aaf-9fec-2b50533d3cae","Type":"ContainerDied","Data":"16f2e4b6a9b14352515d1b370e2f4f84e250f6963b28fb04190abdc0181c01d6"} Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.471250 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16f2e4b6a9b14352515d1b370e2f4f84e250f6963b28fb04190abdc0181c01d6" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 
09:34:51.470950 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.536762 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"] Mar 10 09:34:51 crc kubenswrapper[4883]: E0310 09:34:51.537154 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0efdf39d-2133-4aaf-9fec-2b50533d3cae" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.537173 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="0efdf39d-2133-4aaf-9fec-2b50533d3cae" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.537353 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="0efdf39d-2133-4aaf-9fec-2b50533d3cae" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.538032 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.541555 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.541631 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542047 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542107 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542179 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542183 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542240 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.542239 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.548280 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"] Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720198 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720255 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720427 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720608 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6tzv\" (UniqueName: 
\"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720689 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720764 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720799 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720871 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720922 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.720952 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.721011 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.721092 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.721203 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.821967 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822004 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822040 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822070 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822088 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822116 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822154 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: 
\"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822200 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822247 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822271 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822295 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc 
kubenswrapper[4883]: I0310 09:34:51.822317 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822333 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6tzv\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.822363 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.826595 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.826691 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.827014 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.827642 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.828074 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.828896 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.828975 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.828996 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.829055 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.830081 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.830461 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.830712 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.832030 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.843048 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6tzv\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9\" (UID: 
\"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:51 crc kubenswrapper[4883]: I0310 09:34:51.851638 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:34:52 crc kubenswrapper[4883]: I0310 09:34:52.297422 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9"] Mar 10 09:34:52 crc kubenswrapper[4883]: I0310 09:34:52.481335 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" event={"ID":"7e9f7531-37e1-4284-94ac-cada3d2fc301","Type":"ContainerStarted","Data":"127055f8bdc0041304754e84a763bfae34d394957e46e16446f33cdbc93502be"} Mar 10 09:34:53 crc kubenswrapper[4883]: I0310 09:34:53.491430 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" event={"ID":"7e9f7531-37e1-4284-94ac-cada3d2fc301","Type":"ContainerStarted","Data":"2cd02ede66a5fe79061c1a6091f99f4680432d2ba4ea8cd1a7417070b12939f8"} Mar 10 09:34:53 crc kubenswrapper[4883]: I0310 09:34:53.519607 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" podStartSLOduration=2.021470864 podStartE2EDuration="2.519580421s" podCreationTimestamp="2026-03-10 09:34:51 +0000 UTC" firstStartedPulling="2026-03-10 09:34:52.303328646 +0000 UTC m=+1878.558226535" lastFinishedPulling="2026-03-10 09:34:52.801438204 +0000 UTC m=+1879.056336092" observedRunningTime="2026-03-10 09:34:53.51062245 +0000 UTC m=+1879.765520340" watchObservedRunningTime="2026-03-10 09:34:53.519580421 +0000 UTC m=+1879.774478310" Mar 10 09:34:55 crc kubenswrapper[4883]: I0310 09:34:55.080598 4883 scope.go:117] "RemoveContainer" 
containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:34:55 crc kubenswrapper[4883]: I0310 09:34:55.512817 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d"} Mar 10 09:34:56 crc kubenswrapper[4883]: I0310 09:34:56.037115 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-j9wml"] Mar 10 09:34:56 crc kubenswrapper[4883]: I0310 09:34:56.043453 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-j9wml"] Mar 10 09:34:56 crc kubenswrapper[4883]: I0310 09:34:56.089200 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf096652-ae85-4c98-8821-cd47eafae98f" path="/var/lib/kubelet/pods/cf096652-ae85-4c98-8821-cd47eafae98f/volumes" Mar 10 09:35:05 crc kubenswrapper[4883]: I0310 09:35:05.404360 4883 scope.go:117] "RemoveContainer" containerID="bec7039a9730daef00f6f750a9ad04bbfacd1a82b7aa8fa9152df9b60b9ae6d2" Mar 10 09:35:05 crc kubenswrapper[4883]: I0310 09:35:05.444246 4883 scope.go:117] "RemoveContainer" containerID="354a3c28cf873c296da4850a04f59db6b94dbed1b8ba2c5447e9a4525bd40aed" Mar 10 09:35:05 crc kubenswrapper[4883]: I0310 09:35:05.479456 4883 scope.go:117] "RemoveContainer" containerID="9cbbbfbaa746758ec604b741e62dd7f69a1ec3688b2c70a13290cb0a5d5252e8" Mar 10 09:35:05 crc kubenswrapper[4883]: I0310 09:35:05.511662 4883 scope.go:117] "RemoveContainer" containerID="1b941b684a31273444bcbcfa6618c7c4de8354efb411f75cdb5b7cfd833281c3" Mar 10 09:35:05 crc kubenswrapper[4883]: I0310 09:35:05.557389 4883 scope.go:117] "RemoveContainer" containerID="996ee60e7ca18498ccd88f148bbd61209a4f5cc8500909482eda8135e24f983e" Mar 10 09:35:19 crc kubenswrapper[4883]: I0310 09:35:19.745485 4883 generic.go:334] "Generic (PLEG): 
container finished" podID="7e9f7531-37e1-4284-94ac-cada3d2fc301" containerID="2cd02ede66a5fe79061c1a6091f99f4680432d2ba4ea8cd1a7417070b12939f8" exitCode=0 Mar 10 09:35:19 crc kubenswrapper[4883]: I0310 09:35:19.745561 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" event={"ID":"7e9f7531-37e1-4284-94ac-cada3d2fc301","Type":"ContainerDied","Data":"2cd02ede66a5fe79061c1a6091f99f4680432d2ba4ea8cd1a7417070b12939f8"} Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.099367 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251202 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251261 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251336 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251368 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251439 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251470 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251512 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251537 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251564 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251588 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251614 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6tzv\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251682 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251706 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.251735 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7e9f7531-37e1-4284-94ac-cada3d2fc301\" (UID: \"7e9f7531-37e1-4284-94ac-cada3d2fc301\") " Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.259924 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.260043 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.260975 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.261381 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.261959 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262118 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262220 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). 
InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262246 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv" (OuterVolumeSpecName: "kube-api-access-t6tzv") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "kube-api-access-t6tzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262508 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262720 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.262839 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.264792 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.281229 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory" (OuterVolumeSpecName: "inventory") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.283737 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e9f7531-37e1-4284-94ac-cada3d2fc301" (UID: "7e9f7531-37e1-4284-94ac-cada3d2fc301"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353521 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353558 4883 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353572 4883 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353583 4883 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353596 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353606 4883 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353617 4883 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-t6tzv\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-kube-api-access-t6tzv\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353627 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353636 4883 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353657 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353669 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353680 4883 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353692 4883 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e9f7531-37e1-4284-94ac-cada3d2fc301-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.353702 4883 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e9f7531-37e1-4284-94ac-cada3d2fc301-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.766389 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" event={"ID":"7e9f7531-37e1-4284-94ac-cada3d2fc301","Type":"ContainerDied","Data":"127055f8bdc0041304754e84a763bfae34d394957e46e16446f33cdbc93502be"} Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.766802 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="127055f8bdc0041304754e84a763bfae34d394957e46e16446f33cdbc93502be" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.766506 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.910766 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"] Mar 10 09:35:21 crc kubenswrapper[4883]: E0310 09:35:21.911203 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e9f7531-37e1-4284-94ac-cada3d2fc301" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.911223 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e9f7531-37e1-4284-94ac-cada3d2fc301" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.911381 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e9f7531-37e1-4284-94ac-cada3d2fc301" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.912102 4883 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.913540 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.913678 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.913852 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.914032 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.915571 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:35:21 crc kubenswrapper[4883]: I0310 09:35:21.919505 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"] Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.065227 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.065334 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmfbp\" (UniqueName: \"kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: 
\"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.065376 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.065432 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.065487 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.166719 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmfbp\" (UniqueName: \"kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.166792 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.166845 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.166890 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.166950 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.168720 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.172533 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.173737 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.174930 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.184709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmfbp\" (UniqueName: \"kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7cqkz\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.231937 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.693417 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"]
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.698279 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 09:35:22 crc kubenswrapper[4883]: I0310 09:35:22.774983 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" event={"ID":"bbcde384-73a5-48c3-a5fb-226d671707cb","Type":"ContainerStarted","Data":"27fed5b683575c0a08b954a9ae7d46ae8f20e461778e70a56e2493bbed622009"}
Mar 10 09:35:23 crc kubenswrapper[4883]: I0310 09:35:23.789190 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" event={"ID":"bbcde384-73a5-48c3-a5fb-226d671707cb","Type":"ContainerStarted","Data":"1d006b59011e7940ba2ae5a4022657be65cc4bdf6115e6c741e5f6a8f0eb1eeb"}
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.129556 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" podStartSLOduration=38.593123913 podStartE2EDuration="39.129532487s" podCreationTimestamp="2026-03-10 09:35:21 +0000 UTC" firstStartedPulling="2026-03-10 09:35:22.698043881 +0000 UTC m=+1908.952941771" lastFinishedPulling="2026-03-10 09:35:23.234452456 +0000 UTC m=+1909.489350345" observedRunningTime="2026-03-10 09:35:23.815576056 +0000 UTC m=+1910.070473935" watchObservedRunningTime="2026-03-10 09:36:00.129532487 +0000 UTC m=+1946.384430375"
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.138342 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552256-fmqzj"]
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.139836 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552256-fmqzj"
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.142025 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.143284 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.143426 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.157820 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552256-fmqzj"]
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.335001 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5bj\" (UniqueName: \"kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj\") pod \"auto-csr-approver-29552256-fmqzj\" (UID: \"c70e8b0b-51ad-4080-8955-8aa8ee68f274\") " pod="openshift-infra/auto-csr-approver-29552256-fmqzj"
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.438262 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5bj\" (UniqueName: \"kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj\") pod \"auto-csr-approver-29552256-fmqzj\" (UID: \"c70e8b0b-51ad-4080-8955-8aa8ee68f274\") " pod="openshift-infra/auto-csr-approver-29552256-fmqzj"
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.458440 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5bj\" (UniqueName: \"kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj\") pod \"auto-csr-approver-29552256-fmqzj\" (UID: \"c70e8b0b-51ad-4080-8955-8aa8ee68f274\") " pod="openshift-infra/auto-csr-approver-29552256-fmqzj"
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.460134 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552256-fmqzj"
Mar 10 09:36:00 crc kubenswrapper[4883]: I0310 09:36:00.858441 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552256-fmqzj"]
Mar 10 09:36:01 crc kubenswrapper[4883]: I0310 09:36:01.120383 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" event={"ID":"c70e8b0b-51ad-4080-8955-8aa8ee68f274","Type":"ContainerStarted","Data":"7f943ddc9f27bb461c68b30cb9b1933997a1a124f599f94f92475364a5e85cb2"}
Mar 10 09:36:02 crc kubenswrapper[4883]: I0310 09:36:02.131533 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" event={"ID":"c70e8b0b-51ad-4080-8955-8aa8ee68f274","Type":"ContainerStarted","Data":"f3c4bd29935ee27cc39cdc84c3f636e36dc2e8b9c1bfc85b30473304c9e1a2ff"}
Mar 10 09:36:02 crc kubenswrapper[4883]: I0310 09:36:02.157052 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" podStartSLOduration=1.186254833 podStartE2EDuration="2.157035736s" podCreationTimestamp="2026-03-10 09:36:00 +0000 UTC" firstStartedPulling="2026-03-10 09:36:00.861416858 +0000 UTC m=+1947.116314748" lastFinishedPulling="2026-03-10 09:36:01.832197762 +0000 UTC m=+1948.087095651" observedRunningTime="2026-03-10 09:36:02.146124339 +0000 UTC m=+1948.401022229" watchObservedRunningTime="2026-03-10 09:36:02.157035736 +0000 UTC m=+1948.411933625"
Mar 10 09:36:03 crc kubenswrapper[4883]: I0310 09:36:03.145672 4883 generic.go:334] "Generic (PLEG): container finished" podID="c70e8b0b-51ad-4080-8955-8aa8ee68f274" containerID="f3c4bd29935ee27cc39cdc84c3f636e36dc2e8b9c1bfc85b30473304c9e1a2ff" exitCode=0
Mar 10 09:36:03 crc kubenswrapper[4883]: I0310 09:36:03.145771 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" event={"ID":"c70e8b0b-51ad-4080-8955-8aa8ee68f274","Type":"ContainerDied","Data":"f3c4bd29935ee27cc39cdc84c3f636e36dc2e8b9c1bfc85b30473304c9e1a2ff"}
Mar 10 09:36:04 crc kubenswrapper[4883]: I0310 09:36:04.449714 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552256-fmqzj"
Mar 10 09:36:04 crc kubenswrapper[4883]: I0310 09:36:04.631490 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z5bj\" (UniqueName: \"kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj\") pod \"c70e8b0b-51ad-4080-8955-8aa8ee68f274\" (UID: \"c70e8b0b-51ad-4080-8955-8aa8ee68f274\") "
Mar 10 09:36:04 crc kubenswrapper[4883]: I0310 09:36:04.637375 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj" (OuterVolumeSpecName: "kube-api-access-6z5bj") pod "c70e8b0b-51ad-4080-8955-8aa8ee68f274" (UID: "c70e8b0b-51ad-4080-8955-8aa8ee68f274"). InnerVolumeSpecName "kube-api-access-6z5bj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:36:04 crc kubenswrapper[4883]: I0310 09:36:04.734074 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z5bj\" (UniqueName: \"kubernetes.io/projected/c70e8b0b-51ad-4080-8955-8aa8ee68f274-kube-api-access-6z5bj\") on node \"crc\" DevicePath \"\""
Mar 10 09:36:05 crc kubenswrapper[4883]: I0310 09:36:05.170825 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552256-fmqzj" event={"ID":"c70e8b0b-51ad-4080-8955-8aa8ee68f274","Type":"ContainerDied","Data":"7f943ddc9f27bb461c68b30cb9b1933997a1a124f599f94f92475364a5e85cb2"}
Mar 10 09:36:05 crc kubenswrapper[4883]: I0310 09:36:05.170885 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f943ddc9f27bb461c68b30cb9b1933997a1a124f599f94f92475364a5e85cb2"
Mar 10 09:36:05 crc kubenswrapper[4883]: I0310 09:36:05.170891 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552256-fmqzj"
Mar 10 09:36:05 crc kubenswrapper[4883]: I0310 09:36:05.217410 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552250-bzz2p"]
Mar 10 09:36:05 crc kubenswrapper[4883]: I0310 09:36:05.223425 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552250-bzz2p"]
Mar 10 09:36:06 crc kubenswrapper[4883]: I0310 09:36:06.097998 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea7fca8-0ec0-44f9-b729-2c150761519f" path="/var/lib/kubelet/pods/aea7fca8-0ec0-44f9-b729-2c150761519f/volumes"
Mar 10 09:36:10 crc kubenswrapper[4883]: I0310 09:36:10.216076 4883 generic.go:334] "Generic (PLEG): container finished" podID="bbcde384-73a5-48c3-a5fb-226d671707cb" containerID="1d006b59011e7940ba2ae5a4022657be65cc4bdf6115e6c741e5f6a8f0eb1eeb" exitCode=0
Mar 10 09:36:10 crc kubenswrapper[4883]: I0310 09:36:10.216191 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" event={"ID":"bbcde384-73a5-48c3-a5fb-226d671707cb","Type":"ContainerDied","Data":"1d006b59011e7940ba2ae5a4022657be65cc4bdf6115e6c741e5f6a8f0eb1eeb"}
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.587242 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.670560 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmfbp\" (UniqueName: \"kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") "
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.670629 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") "
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.670810 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") "
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.670945 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") "
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.671405 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") "
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.676090 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.676267 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp" (OuterVolumeSpecName: "kube-api-access-tmfbp") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb"). InnerVolumeSpecName "kube-api-access-tmfbp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:36:11 crc kubenswrapper[4883]: E0310 09:36:11.691557 4883 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam podName:bbcde384-73a5-48c3-a5fb-226d671707cb nodeName:}" failed. No retries permitted until 2026-03-10 09:36:12.191518475 +0000 UTC m=+1958.446416364 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb") : error deleting /var/lib/kubelet/pods/bbcde384-73a5-48c3-a5fb-226d671707cb/volume-subpaths: remove /var/lib/kubelet/pods/bbcde384-73a5-48c3-a5fb-226d671707cb/volume-subpaths: no such file or directory
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.691942 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.692941 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory" (OuterVolumeSpecName: "inventory") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.773556 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmfbp\" (UniqueName: \"kubernetes.io/projected/bbcde384-73a5-48c3-a5fb-226d671707cb-kube-api-access-tmfbp\") on node \"crc\" DevicePath \"\""
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.773584 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.773596 4883 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/bbcde384-73a5-48c3-a5fb-226d671707cb-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Mar 10 09:36:11 crc kubenswrapper[4883]: I0310 09:36:11.773605 4883 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.233951 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz" event={"ID":"bbcde384-73a5-48c3-a5fb-226d671707cb","Type":"ContainerDied","Data":"27fed5b683575c0a08b954a9ae7d46ae8f20e461778e70a56e2493bbed622009"}
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.234007 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27fed5b683575c0a08b954a9ae7d46ae8f20e461778e70a56e2493bbed622009"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.234013 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7cqkz"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.280236 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") pod \"bbcde384-73a5-48c3-a5fb-226d671707cb\" (UID: \"bbcde384-73a5-48c3-a5fb-226d671707cb\") "
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.284289 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bbcde384-73a5-48c3-a5fb-226d671707cb" (UID: "bbcde384-73a5-48c3-a5fb-226d671707cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.302550 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"]
Mar 10 09:36:12 crc kubenswrapper[4883]: E0310 09:36:12.302943 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70e8b0b-51ad-4080-8955-8aa8ee68f274" containerName="oc"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.302961 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70e8b0b-51ad-4080-8955-8aa8ee68f274" containerName="oc"
Mar 10 09:36:12 crc kubenswrapper[4883]: E0310 09:36:12.302975 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbcde384-73a5-48c3-a5fb-226d671707cb" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.302982 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcde384-73a5-48c3-a5fb-226d671707cb" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.303168 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70e8b0b-51ad-4080-8955-8aa8ee68f274" containerName="oc"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.303189 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbcde384-73a5-48c3-a5fb-226d671707cb" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.303882 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.305927 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.306341 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.313187 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"]
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.383867 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bbcde384-73a5-48c3-a5fb-226d671707cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487141 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487539 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stcch\" (UniqueName: \"kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487595 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487631 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487702 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.487744 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589060 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589179 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589218 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stcch\" (UniqueName: \"kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589262 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589290 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.589933 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.594863 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.594987 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.596167 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.596709 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.599256 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.605862 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stcch\" (UniqueName: \"kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:12 crc kubenswrapper[4883]: I0310 09:36:12.632325 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"
Mar 10 09:36:13 crc kubenswrapper[4883]: I0310 09:36:13.135504 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4"]
Mar 10 09:36:13 crc kubenswrapper[4883]: I0310 09:36:13.245836 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" event={"ID":"d37d0afe-ad64-4616-b877-bd05deefd038","Type":"ContainerStarted","Data":"1447bbc4710c3544c6436b09315bed36a46021e5b37601faa4c7e6f80c6d6f28"}
Mar 10 09:36:14 crc kubenswrapper[4883]: I0310 09:36:14.259936 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" event={"ID":"d37d0afe-ad64-4616-b877-bd05deefd038","Type":"ContainerStarted","Data":"5c34c6a621b1787c79c91995c5363b35386fa4a4b8f3dd41526947887890640b"}
Mar 10 09:36:14 crc kubenswrapper[4883]: I0310 09:36:14.285468 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" podStartSLOduration=1.7759773540000001 podStartE2EDuration="2.285446886s" podCreationTimestamp="2026-03-10 09:36:12 +0000 UTC" firstStartedPulling="2026-03-10 09:36:13.140961772 +0000 UTC m=+1959.395859651" lastFinishedPulling="2026-03-10 09:36:13.650431293 +0000 UTC m=+1959.905329183" observedRunningTime="2026-03-10 09:36:14.281027982 +0000 UTC m=+1960.535925881" watchObservedRunningTime="2026-03-10 09:36:14.285446886 +0000 UTC m=+1960.540344765"
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.480491 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"]
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.484156 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j4kp"
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.488568 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2j7z\" (UniqueName: \"kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp"
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.488639 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp"
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.488776 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp"
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.502920 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"]
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.594570 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2j7z\" (UniqueName: \"kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp"
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.594622 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp"
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.594689 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp"
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.595257 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp"
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.595269 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp"
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.613645 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2j7z\" (UniqueName: \"kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z\") pod \"redhat-marketplace-8j4kp\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " pod="openshift-marketplace/redhat-marketplace-8j4kp"
Mar 10 09:36:22 crc kubenswrapper[4883]: I0310 09:36:22.804207 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j4kp"
Mar 10 09:36:23 crc kubenswrapper[4883]: I0310 09:36:23.206626 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"]
Mar 10 09:36:23 crc kubenswrapper[4883]: I0310 09:36:23.337635 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerStarted","Data":"390eed8ba1d2bc57af65e38ba2174fcb259566f3e424f5c3de06515e76c15665"}
Mar 10 09:36:24 crc kubenswrapper[4883]: I0310 09:36:24.349711 4883 generic.go:334] "Generic (PLEG): container finished" podID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerID="63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b" exitCode=0
Mar 10 09:36:24 crc kubenswrapper[4883]: I0310 09:36:24.349911 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerDied","Data":"63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b"}
Mar 10 09:36:25 crc kubenswrapper[4883]: I0310 09:36:25.360833 4883 generic.go:334] "Generic (PLEG): container finished" podID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerID="78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8" exitCode=0
Mar 10 09:36:25 crc kubenswrapper[4883]: I0310 09:36:25.360925 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerDied","Data":"78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8"}
Mar 10 09:36:26 crc kubenswrapper[4883]: I0310 09:36:26.373431
4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerStarted","Data":"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2"} Mar 10 09:36:26 crc kubenswrapper[4883]: I0310 09:36:26.393631 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8j4kp" podStartSLOduration=2.91504723 podStartE2EDuration="4.393607076s" podCreationTimestamp="2026-03-10 09:36:22 +0000 UTC" firstStartedPulling="2026-03-10 09:36:24.353071568 +0000 UTC m=+1970.607969458" lastFinishedPulling="2026-03-10 09:36:25.831631416 +0000 UTC m=+1972.086529304" observedRunningTime="2026-03-10 09:36:26.388971944 +0000 UTC m=+1972.643869834" watchObservedRunningTime="2026-03-10 09:36:26.393607076 +0000 UTC m=+1972.648504965" Mar 10 09:36:32 crc kubenswrapper[4883]: I0310 09:36:32.804857 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:32 crc kubenswrapper[4883]: I0310 09:36:32.805169 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:32 crc kubenswrapper[4883]: I0310 09:36:32.841331 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:33 crc kubenswrapper[4883]: I0310 09:36:33.469586 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:33 crc kubenswrapper[4883]: I0310 09:36:33.519232 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"] Mar 10 09:36:35 crc kubenswrapper[4883]: I0310 09:36:35.452887 4883 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-8j4kp" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="registry-server" containerID="cri-o://6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2" gracePeriod=2 Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.310379 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.372870 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities\") pod \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.372974 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2j7z\" (UniqueName: \"kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z\") pod \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.373089 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content\") pod \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\" (UID: \"a0d40308-0487-45a0-9ebe-8978ccc4b10b\") " Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.373884 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities" (OuterVolumeSpecName: "utilities") pod "a0d40308-0487-45a0-9ebe-8978ccc4b10b" (UID: "a0d40308-0487-45a0-9ebe-8978ccc4b10b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.378692 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z" (OuterVolumeSpecName: "kube-api-access-d2j7z") pod "a0d40308-0487-45a0-9ebe-8978ccc4b10b" (UID: "a0d40308-0487-45a0-9ebe-8978ccc4b10b"). InnerVolumeSpecName "kube-api-access-d2j7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.393788 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0d40308-0487-45a0-9ebe-8978ccc4b10b" (UID: "a0d40308-0487-45a0-9ebe-8978ccc4b10b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.466136 4883 generic.go:334] "Generic (PLEG): container finished" podID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerID="6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2" exitCode=0 Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.466209 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerDied","Data":"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2"} Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.466228 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j4kp" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.466258 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j4kp" event={"ID":"a0d40308-0487-45a0-9ebe-8978ccc4b10b","Type":"ContainerDied","Data":"390eed8ba1d2bc57af65e38ba2174fcb259566f3e424f5c3de06515e76c15665"} Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.466284 4883 scope.go:117] "RemoveContainer" containerID="6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.476747 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2j7z\" (UniqueName: \"kubernetes.io/projected/a0d40308-0487-45a0-9ebe-8978ccc4b10b-kube-api-access-d2j7z\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.476774 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.476810 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0d40308-0487-45a0-9ebe-8978ccc4b10b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.492566 4883 scope.go:117] "RemoveContainer" containerID="78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.495334 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"] Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.501461 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j4kp"] Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.515719 4883 scope.go:117] 
"RemoveContainer" containerID="63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.546617 4883 scope.go:117] "RemoveContainer" containerID="6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2" Mar 10 09:36:36 crc kubenswrapper[4883]: E0310 09:36:36.547072 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2\": container with ID starting with 6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2 not found: ID does not exist" containerID="6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.547107 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2"} err="failed to get container status \"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2\": rpc error: code = NotFound desc = could not find container \"6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2\": container with ID starting with 6fd6952b7cda30d4e703480c2f328f5b52d28df10211792e0e869fb9bfc505a2 not found: ID does not exist" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.547132 4883 scope.go:117] "RemoveContainer" containerID="78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8" Mar 10 09:36:36 crc kubenswrapper[4883]: E0310 09:36:36.547427 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8\": container with ID starting with 78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8 not found: ID does not exist" containerID="78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8" Mar 10 09:36:36 crc 
kubenswrapper[4883]: I0310 09:36:36.547468 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8"} err="failed to get container status \"78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8\": rpc error: code = NotFound desc = could not find container \"78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8\": container with ID starting with 78b0eb578eebed15e85775f7197a7afcfb32f82a62b84666c9d829473263fbf8 not found: ID does not exist" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.547517 4883 scope.go:117] "RemoveContainer" containerID="63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b" Mar 10 09:36:36 crc kubenswrapper[4883]: E0310 09:36:36.547860 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b\": container with ID starting with 63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b not found: ID does not exist" containerID="63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b" Mar 10 09:36:36 crc kubenswrapper[4883]: I0310 09:36:36.547884 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b"} err="failed to get container status \"63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b\": rpc error: code = NotFound desc = could not find container \"63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b\": container with ID starting with 63d8e07204a8b66582ca0cf54bdf292e5f1134c22e2f3481556ce09b706fa19b not found: ID does not exist" Mar 10 09:36:38 crc kubenswrapper[4883]: I0310 09:36:38.088898 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" 
path="/var/lib/kubelet/pods/a0d40308-0487-45a0-9ebe-8978ccc4b10b/volumes" Mar 10 09:36:49 crc kubenswrapper[4883]: I0310 09:36:49.591519 4883 generic.go:334] "Generic (PLEG): container finished" podID="d37d0afe-ad64-4616-b877-bd05deefd038" containerID="5c34c6a621b1787c79c91995c5363b35386fa4a4b8f3dd41526947887890640b" exitCode=0 Mar 10 09:36:49 crc kubenswrapper[4883]: I0310 09:36:49.591589 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" event={"ID":"d37d0afe-ad64-4616-b877-bd05deefd038","Type":"ContainerDied","Data":"5c34c6a621b1787c79c91995c5363b35386fa4a4b8f3dd41526947887890640b"} Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.948292 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953637 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stcch\" (UniqueName: \"kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch\") pod \"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953702 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0\") pod \"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953787 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953861 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam\") pod \"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953912 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory\") pod \"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.953990 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle\") pod \"d37d0afe-ad64-4616-b877-bd05deefd038\" (UID: \"d37d0afe-ad64-4616-b877-bd05deefd038\") " Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.960655 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.960660 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch" (OuterVolumeSpecName: "kube-api-access-stcch") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "kube-api-access-stcch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.980192 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory" (OuterVolumeSpecName: "inventory") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.984469 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.986420 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:50 crc kubenswrapper[4883]: I0310 09:36:50.986964 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "d37d0afe-ad64-4616-b877-bd05deefd038" (UID: "d37d0afe-ad64-4616-b877-bd05deefd038"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.055993 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stcch\" (UniqueName: \"kubernetes.io/projected/d37d0afe-ad64-4616-b877-bd05deefd038-kube-api-access-stcch\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.056056 4883 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.056071 4883 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.056084 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.056098 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 
09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.056108 4883 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37d0afe-ad64-4616-b877-bd05deefd038-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.609030 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" event={"ID":"d37d0afe-ad64-4616-b877-bd05deefd038","Type":"ContainerDied","Data":"1447bbc4710c3544c6436b09315bed36a46021e5b37601faa4c7e6f80c6d6f28"} Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.609088 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1447bbc4710c3544c6436b09315bed36a46021e5b37601faa4c7e6f80c6d6f28" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.609407 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.709719 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw"] Mar 10 09:36:51 crc kubenswrapper[4883]: E0310 09:36:51.710420 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="extract-utilities" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.710546 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="extract-utilities" Mar 10 09:36:51 crc kubenswrapper[4883]: E0310 09:36:51.710640 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37d0afe-ad64-4616-b877-bd05deefd038" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.710706 4883 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d37d0afe-ad64-4616-b877-bd05deefd038" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 09:36:51 crc kubenswrapper[4883]: E0310 09:36:51.710774 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="registry-server" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.710833 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="registry-server" Mar 10 09:36:51 crc kubenswrapper[4883]: E0310 09:36:51.710911 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="extract-content" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.710973 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="extract-content" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.711221 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d37d0afe-ad64-4616-b877-bd05deefd038" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.711285 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d40308-0487-45a0-9ebe-8978ccc4b10b" containerName="registry-server" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.712175 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.714550 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.715341 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.715416 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.715340 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.715600 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.717393 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw"] Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.768201 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.768400 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.768561 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.768665 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.768815 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6mcn\" (UniqueName: \"kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.870314 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.870634 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.870782 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6mcn\" (UniqueName: \"kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.870964 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.871070 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.875277 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: 
\"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.875502 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.875961 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.876340 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:51 crc kubenswrapper[4883]: I0310 09:36:51.885208 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6mcn\" (UniqueName: \"kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-thhsw\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:52 crc kubenswrapper[4883]: I0310 09:36:52.029785 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:36:52 crc kubenswrapper[4883]: I0310 09:36:52.473312 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw"] Mar 10 09:36:52 crc kubenswrapper[4883]: I0310 09:36:52.618176 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" event={"ID":"eb3b72a2-945a-4719-87c0-ffaf7eb84b52","Type":"ContainerStarted","Data":"5132af8adf7ec0c2f156b654b7c9255b712406e2697f68910d1bb75a882c3793"} Mar 10 09:36:53 crc kubenswrapper[4883]: I0310 09:36:53.626518 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" event={"ID":"eb3b72a2-945a-4719-87c0-ffaf7eb84b52","Type":"ContainerStarted","Data":"ee183e9dfde8ff8beaedf0c3401eda98426713aedff5f916f72c140e022aa4c6"} Mar 10 09:36:53 crc kubenswrapper[4883]: I0310 09:36:53.652922 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" podStartSLOduration=2.155237998 podStartE2EDuration="2.652901696s" podCreationTimestamp="2026-03-10 09:36:51 +0000 UTC" firstStartedPulling="2026-03-10 09:36:52.481106129 +0000 UTC m=+1998.736004018" lastFinishedPulling="2026-03-10 09:36:52.978769827 +0000 UTC m=+1999.233667716" observedRunningTime="2026-03-10 09:36:53.640018139 +0000 UTC m=+1999.894916028" watchObservedRunningTime="2026-03-10 09:36:53.652901696 +0000 UTC m=+1999.907799585" Mar 10 09:37:05 crc kubenswrapper[4883]: I0310 09:37:05.673019 4883 scope.go:117] "RemoveContainer" containerID="55b96e78b0c842f2c57bd194eb8366216f87ea2f29ff4952bc2e23513be83089" Mar 10 09:37:17 crc kubenswrapper[4883]: I0310 09:37:17.448954 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:37:17 crc kubenswrapper[4883]: I0310 09:37:17.449542 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:37:47 crc kubenswrapper[4883]: I0310 09:37:47.449197 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:37:47 crc kubenswrapper[4883]: I0310 09:37:47.449741 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.137094 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552258-6pf25"] Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.139124 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.141465 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.141500 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.143706 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552258-6pf25"] Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.143801 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.288157 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl5zc\" (UniqueName: \"kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc\") pod \"auto-csr-approver-29552258-6pf25\" (UID: \"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5\") " pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.390007 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl5zc\" (UniqueName: \"kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc\") pod \"auto-csr-approver-29552258-6pf25\" (UID: \"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5\") " pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.408306 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl5zc\" (UniqueName: \"kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc\") pod \"auto-csr-approver-29552258-6pf25\" (UID: \"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5\") " 
pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.454409 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:00 crc kubenswrapper[4883]: I0310 09:38:00.863771 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552258-6pf25"] Mar 10 09:38:00 crc kubenswrapper[4883]: W0310 09:38:00.873250 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ff28fc4_3134_48ba_9697_b74e9b4e6ec5.slice/crio-ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5 WatchSource:0}: Error finding container ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5: Status 404 returned error can't find the container with id ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5 Mar 10 09:38:01 crc kubenswrapper[4883]: I0310 09:38:01.221444 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552258-6pf25" event={"ID":"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5","Type":"ContainerStarted","Data":"ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5"} Mar 10 09:38:03 crc kubenswrapper[4883]: I0310 09:38:03.256886 4883 generic.go:334] "Generic (PLEG): container finished" podID="2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" containerID="0172a22ca1b2674eaf7b22f058338bb6d8a1a070eb6300e099b6179e3eec55d7" exitCode=0 Mar 10 09:38:03 crc kubenswrapper[4883]: I0310 09:38:03.257347 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552258-6pf25" event={"ID":"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5","Type":"ContainerDied","Data":"0172a22ca1b2674eaf7b22f058338bb6d8a1a070eb6300e099b6179e3eec55d7"} Mar 10 09:38:04 crc kubenswrapper[4883]: I0310 09:38:04.555116 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:04 crc kubenswrapper[4883]: I0310 09:38:04.684023 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl5zc\" (UniqueName: \"kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc\") pod \"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5\" (UID: \"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5\") " Mar 10 09:38:04 crc kubenswrapper[4883]: I0310 09:38:04.689988 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc" (OuterVolumeSpecName: "kube-api-access-cl5zc") pod "2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" (UID: "2ff28fc4-3134-48ba-9697-b74e9b4e6ec5"). InnerVolumeSpecName "kube-api-access-cl5zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:38:04 crc kubenswrapper[4883]: I0310 09:38:04.787461 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl5zc\" (UniqueName: \"kubernetes.io/projected/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5-kube-api-access-cl5zc\") on node \"crc\" DevicePath \"\"" Mar 10 09:38:05 crc kubenswrapper[4883]: I0310 09:38:05.289146 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552258-6pf25" event={"ID":"2ff28fc4-3134-48ba-9697-b74e9b4e6ec5","Type":"ContainerDied","Data":"ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5"} Mar 10 09:38:05 crc kubenswrapper[4883]: I0310 09:38:05.289207 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac4cef9d0fe3f52a260595bdab7a0c36bd45dabcab08e3909ee82747ad271dc5" Mar 10 09:38:05 crc kubenswrapper[4883]: I0310 09:38:05.289271 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552258-6pf25" Mar 10 09:38:05 crc kubenswrapper[4883]: I0310 09:38:05.613625 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552252-vcsb9"] Mar 10 09:38:05 crc kubenswrapper[4883]: I0310 09:38:05.620594 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552252-vcsb9"] Mar 10 09:38:06 crc kubenswrapper[4883]: I0310 09:38:06.091638 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc74aa89-09d6-4974-a6c1-1642f6ef0a64" path="/var/lib/kubelet/pods/fc74aa89-09d6-4974-a6c1-1642f6ef0a64/volumes" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.043277 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:17 crc kubenswrapper[4883]: E0310 09:38:17.044875 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" containerName="oc" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.044896 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" containerName="oc" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.045230 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" containerName="oc" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.047595 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.063228 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.130427 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.130504 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkn7w\" (UniqueName: \"kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.130589 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.232312 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.232639 4883 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kkn7w\" (UniqueName: \"kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.232803 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.232834 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.233046 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.256044 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkn7w\" (UniqueName: \"kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w\") pod \"community-operators-f5q5h\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.372887 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.449216 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.449275 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.449331 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.450075 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.450148 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d" gracePeriod=600 Mar 10 09:38:17 crc kubenswrapper[4883]: I0310 09:38:17.871607 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:17 crc kubenswrapper[4883]: W0310 09:38:17.872231 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd770ddb2_bbf8_4a33_9a98_dddf894c2c86.slice/crio-2f35b42a530fc3ca7d1430f3f39e8e97df8f3699f0a3f6403991bfd53d588470 WatchSource:0}: Error finding container 2f35b42a530fc3ca7d1430f3f39e8e97df8f3699f0a3f6403991bfd53d588470: Status 404 returned error can't find the container with id 2f35b42a530fc3ca7d1430f3f39e8e97df8f3699f0a3f6403991bfd53d588470 Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.412172 4883 generic.go:334] "Generic (PLEG): container finished" podID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerID="95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98" exitCode=0 Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.412456 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerDied","Data":"95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98"} Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.412867 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerStarted","Data":"2f35b42a530fc3ca7d1430f3f39e8e97df8f3699f0a3f6403991bfd53d588470"} Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.416579 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d" exitCode=0 Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.416608 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d"} Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.416624 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b"} Mar 10 09:38:18 crc kubenswrapper[4883]: I0310 09:38:18.416642 4883 scope.go:117] "RemoveContainer" containerID="cf5cc82ba29ed200c8ebe74d8a0d7556429cda7a881b30aff432af65f772e54b" Mar 10 09:38:19 crc kubenswrapper[4883]: I0310 09:38:19.426327 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerStarted","Data":"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73"} Mar 10 09:38:20 crc kubenswrapper[4883]: I0310 09:38:20.443622 4883 generic.go:334] "Generic (PLEG): container finished" podID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerID="00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73" exitCode=0 Mar 10 09:38:20 crc kubenswrapper[4883]: I0310 09:38:20.443731 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerDied","Data":"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73"} Mar 10 09:38:21 crc kubenswrapper[4883]: I0310 09:38:21.457393 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerStarted","Data":"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994"} Mar 10 09:38:21 crc kubenswrapper[4883]: I0310 09:38:21.478728 4883 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/community-operators-f5q5h" podStartSLOduration=1.97855361 podStartE2EDuration="4.478697065s" podCreationTimestamp="2026-03-10 09:38:17 +0000 UTC" firstStartedPulling="2026-03-10 09:38:18.41412307 +0000 UTC m=+2084.669020959" lastFinishedPulling="2026-03-10 09:38:20.914266525 +0000 UTC m=+2087.169164414" observedRunningTime="2026-03-10 09:38:21.473236698 +0000 UTC m=+2087.728134587" watchObservedRunningTime="2026-03-10 09:38:21.478697065 +0000 UTC m=+2087.733594954" Mar 10 09:38:27 crc kubenswrapper[4883]: I0310 09:38:27.373261 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:27 crc kubenswrapper[4883]: I0310 09:38:27.373700 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:27 crc kubenswrapper[4883]: I0310 09:38:27.417778 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:27 crc kubenswrapper[4883]: I0310 09:38:27.555605 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:27 crc kubenswrapper[4883]: I0310 09:38:27.650988 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:29 crc kubenswrapper[4883]: I0310 09:38:29.529188 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f5q5h" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="registry-server" containerID="cri-o://06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994" gracePeriod=2 Mar 10 09:38:29 crc kubenswrapper[4883]: I0310 09:38:29.924566 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.099563 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkn7w\" (UniqueName: \"kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w\") pod \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.099688 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities\") pod \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.099853 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content\") pod \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\" (UID: \"d770ddb2-bbf8-4a33-9a98-dddf894c2c86\") " Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.100577 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities" (OuterVolumeSpecName: "utilities") pod "d770ddb2-bbf8-4a33-9a98-dddf894c2c86" (UID: "d770ddb2-bbf8-4a33-9a98-dddf894c2c86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.107061 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w" (OuterVolumeSpecName: "kube-api-access-kkn7w") pod "d770ddb2-bbf8-4a33-9a98-dddf894c2c86" (UID: "d770ddb2-bbf8-4a33-9a98-dddf894c2c86"). InnerVolumeSpecName "kube-api-access-kkn7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.145856 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d770ddb2-bbf8-4a33-9a98-dddf894c2c86" (UID: "d770ddb2-bbf8-4a33-9a98-dddf894c2c86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.204013 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkn7w\" (UniqueName: \"kubernetes.io/projected/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-kube-api-access-kkn7w\") on node \"crc\" DevicePath \"\"" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.204059 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.204070 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d770ddb2-bbf8-4a33-9a98-dddf894c2c86-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.542848 4883 generic.go:334] "Generic (PLEG): container finished" podID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerID="06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994" exitCode=0 Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.542945 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f5q5h" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.542961 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerDied","Data":"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994"} Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.543342 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f5q5h" event={"ID":"d770ddb2-bbf8-4a33-9a98-dddf894c2c86","Type":"ContainerDied","Data":"2f35b42a530fc3ca7d1430f3f39e8e97df8f3699f0a3f6403991bfd53d588470"} Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.543369 4883 scope.go:117] "RemoveContainer" containerID="06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.571838 4883 scope.go:117] "RemoveContainer" containerID="00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.578365 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.598633 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f5q5h"] Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.598886 4883 scope.go:117] "RemoveContainer" containerID="95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.632322 4883 scope.go:117] "RemoveContainer" containerID="06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994" Mar 10 09:38:30 crc kubenswrapper[4883]: E0310 09:38:30.632811 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994\": container with ID starting with 06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994 not found: ID does not exist" containerID="06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.632856 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994"} err="failed to get container status \"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994\": rpc error: code = NotFound desc = could not find container \"06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994\": container with ID starting with 06e3e46b8c0d0cc296c677a0002ecb76a407f70766398b3cb63b723c8fa43994 not found: ID does not exist" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.632888 4883 scope.go:117] "RemoveContainer" containerID="00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73" Mar 10 09:38:30 crc kubenswrapper[4883]: E0310 09:38:30.633532 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73\": container with ID starting with 00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73 not found: ID does not exist" containerID="00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.633561 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73"} err="failed to get container status \"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73\": rpc error: code = NotFound desc = could not find container \"00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73\": container with ID 
starting with 00f7ba820010963048d82db442f72eacbde596834e7654710b566d1223dbef73 not found: ID does not exist" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.633576 4883 scope.go:117] "RemoveContainer" containerID="95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98" Mar 10 09:38:30 crc kubenswrapper[4883]: E0310 09:38:30.633837 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98\": container with ID starting with 95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98 not found: ID does not exist" containerID="95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98" Mar 10 09:38:30 crc kubenswrapper[4883]: I0310 09:38:30.633873 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98"} err="failed to get container status \"95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98\": rpc error: code = NotFound desc = could not find container \"95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98\": container with ID starting with 95dbe01a07753f92e6400c19ec32fc192941ca8612b4749fcd5f0bf2f0608d98 not found: ID does not exist" Mar 10 09:38:32 crc kubenswrapper[4883]: I0310 09:38:32.089978 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" path="/var/lib/kubelet/pods/d770ddb2-bbf8-4a33-9a98-dddf894c2c86/volumes" Mar 10 09:39:05 crc kubenswrapper[4883]: I0310 09:39:05.786228 4883 scope.go:117] "RemoveContainer" containerID="76a36df1ff76227c193949f769a79a8229f0c35af6ce9046d5c6bb133c432611" Mar 10 09:39:49 crc kubenswrapper[4883]: I0310 09:39:49.259942 4883 generic.go:334] "Generic (PLEG): container finished" podID="eb3b72a2-945a-4719-87c0-ffaf7eb84b52" 
containerID="ee183e9dfde8ff8beaedf0c3401eda98426713aedff5f916f72c140e022aa4c6" exitCode=0 Mar 10 09:39:49 crc kubenswrapper[4883]: I0310 09:39:49.260042 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" event={"ID":"eb3b72a2-945a-4719-87c0-ffaf7eb84b52","Type":"ContainerDied","Data":"ee183e9dfde8ff8beaedf0c3401eda98426713aedff5f916f72c140e022aa4c6"} Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.577365 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.681017 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory\") pod \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.681087 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0\") pod \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.681125 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam\") pod \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.681165 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle\") pod 
\"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.681284 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6mcn\" (UniqueName: \"kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn\") pod \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\" (UID: \"eb3b72a2-945a-4719-87c0-ffaf7eb84b52\") " Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.687260 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn" (OuterVolumeSpecName: "kube-api-access-m6mcn") pod "eb3b72a2-945a-4719-87c0-ffaf7eb84b52" (UID: "eb3b72a2-945a-4719-87c0-ffaf7eb84b52"). InnerVolumeSpecName "kube-api-access-m6mcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.687811 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "eb3b72a2-945a-4719-87c0-ffaf7eb84b52" (UID: "eb3b72a2-945a-4719-87c0-ffaf7eb84b52"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.705530 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "eb3b72a2-945a-4719-87c0-ffaf7eb84b52" (UID: "eb3b72a2-945a-4719-87c0-ffaf7eb84b52"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.705542 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory" (OuterVolumeSpecName: "inventory") pod "eb3b72a2-945a-4719-87c0-ffaf7eb84b52" (UID: "eb3b72a2-945a-4719-87c0-ffaf7eb84b52"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.705889 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eb3b72a2-945a-4719-87c0-ffaf7eb84b52" (UID: "eb3b72a2-945a-4719-87c0-ffaf7eb84b52"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.783156 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6mcn\" (UniqueName: \"kubernetes.io/projected/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-kube-api-access-m6mcn\") on node \"crc\" DevicePath \"\"" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.783186 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.783197 4883 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.783205 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:39:50 crc kubenswrapper[4883]: I0310 09:39:50.783213 4883 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3b72a2-945a-4719-87c0-ffaf7eb84b52-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.275126 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" event={"ID":"eb3b72a2-945a-4719-87c0-ffaf7eb84b52","Type":"ContainerDied","Data":"5132af8adf7ec0c2f156b654b7c9255b712406e2697f68910d1bb75a882c3793"} Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.275173 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5132af8adf7ec0c2f156b654b7c9255b712406e2697f68910d1bb75a882c3793" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.275183 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-thhsw" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.351525 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf"] Mar 10 09:39:51 crc kubenswrapper[4883]: E0310 09:39:51.352134 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="registry-server" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352152 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="registry-server" Mar 10 09:39:51 crc kubenswrapper[4883]: E0310 09:39:51.352178 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="extract-content" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352184 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="extract-content" Mar 10 09:39:51 crc kubenswrapper[4883]: E0310 09:39:51.352205 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb3b72a2-945a-4719-87c0-ffaf7eb84b52" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352212 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb3b72a2-945a-4719-87c0-ffaf7eb84b52" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 09:39:51 crc kubenswrapper[4883]: E0310 09:39:51.352227 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="extract-utilities" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352233 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="extract-utilities" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352400 4883 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d770ddb2-bbf8-4a33-9a98-dddf894c2c86" containerName="registry-server" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.352415 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb3b72a2-945a-4719-87c0-ffaf7eb84b52" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.353025 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.355342 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.355531 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.355709 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.355831 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.355966 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.356084 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.356211 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.362288 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf"] Mar 10 09:39:51 crc 
kubenswrapper[4883]: I0310 09:39:51.396862 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.396908 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.396935 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd8p2\" (UniqueName: \"kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397052 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397126 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397199 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397268 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397307 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397374 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397404 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.397427 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.499345 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.499425 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc 
kubenswrapper[4883]: I0310 09:39:51.499512 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.499538 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.499782 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.500425 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.500508 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.500781 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.501232 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.501288 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.501315 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd8p2\" (UniqueName: \"kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.501376 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.506486 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.506660 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.506892 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.507135 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.507380 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.507946 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.514275 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.515255 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 
09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.516900 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.519326 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd8p2\" (UniqueName: \"kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-47dxf\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:51 crc kubenswrapper[4883]: I0310 09:39:51.665396 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:39:52 crc kubenswrapper[4883]: I0310 09:39:52.134832 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf"] Mar 10 09:39:52 crc kubenswrapper[4883]: I0310 09:39:52.283441 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" event={"ID":"af134b73-8c24-4b9e-b15e-48ff4b83ecd4","Type":"ContainerStarted","Data":"3f29108c5dad1a5578dac32057489e322fee1b450f1de55af05d05420ce128d5"} Mar 10 09:39:53 crc kubenswrapper[4883]: I0310 09:39:53.294570 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" event={"ID":"af134b73-8c24-4b9e-b15e-48ff4b83ecd4","Type":"ContainerStarted","Data":"4951ccf796499d6b47c904d8481fa70523d42eff658bee0de028c1741f6614b5"} Mar 10 09:39:53 crc kubenswrapper[4883]: I0310 09:39:53.317549 4883 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" podStartSLOduration=1.796281427 podStartE2EDuration="2.317527158s" podCreationTimestamp="2026-03-10 09:39:51 +0000 UTC" firstStartedPulling="2026-03-10 09:39:52.138672604 +0000 UTC m=+2178.393570493" lastFinishedPulling="2026-03-10 09:39:52.659918335 +0000 UTC m=+2178.914816224" observedRunningTime="2026-03-10 09:39:53.311018793 +0000 UTC m=+2179.565916682" watchObservedRunningTime="2026-03-10 09:39:53.317527158 +0000 UTC m=+2179.572425047" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.142086 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552260-m5d6m"] Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.143778 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.145297 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.145780 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.146117 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.156490 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552260-m5d6m"] Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.216994 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8dlr\" (UniqueName: \"kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr\") pod \"auto-csr-approver-29552260-m5d6m\" (UID: 
\"9c22b88a-ce63-4c4a-a606-17c563e9e156\") " pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.318006 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8dlr\" (UniqueName: \"kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr\") pod \"auto-csr-approver-29552260-m5d6m\" (UID: \"9c22b88a-ce63-4c4a-a606-17c563e9e156\") " pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.335535 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8dlr\" (UniqueName: \"kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr\") pod \"auto-csr-approver-29552260-m5d6m\" (UID: \"9c22b88a-ce63-4c4a-a606-17c563e9e156\") " pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.461876 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:00 crc kubenswrapper[4883]: I0310 09:40:00.892224 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552260-m5d6m"] Mar 10 09:40:01 crc kubenswrapper[4883]: I0310 09:40:01.363875 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" event={"ID":"9c22b88a-ce63-4c4a-a606-17c563e9e156","Type":"ContainerStarted","Data":"9b1442d84ebc314d87a87c497d41a78e35b6dda32220bfadc36e5e2d1a796ff3"} Mar 10 09:40:03 crc kubenswrapper[4883]: I0310 09:40:03.383416 4883 generic.go:334] "Generic (PLEG): container finished" podID="9c22b88a-ce63-4c4a-a606-17c563e9e156" containerID="43ec4b59a40b1bfb044dd60703dd00f3cac266b590c5664aa4f0403a553e3872" exitCode=0 Mar 10 09:40:03 crc kubenswrapper[4883]: I0310 09:40:03.383517 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" event={"ID":"9c22b88a-ce63-4c4a-a606-17c563e9e156","Type":"ContainerDied","Data":"43ec4b59a40b1bfb044dd60703dd00f3cac266b590c5664aa4f0403a553e3872"} Mar 10 09:40:04 crc kubenswrapper[4883]: I0310 09:40:04.723046 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:04 crc kubenswrapper[4883]: I0310 09:40:04.912622 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8dlr\" (UniqueName: \"kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr\") pod \"9c22b88a-ce63-4c4a-a606-17c563e9e156\" (UID: \"9c22b88a-ce63-4c4a-a606-17c563e9e156\") " Mar 10 09:40:04 crc kubenswrapper[4883]: I0310 09:40:04.919939 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr" (OuterVolumeSpecName: "kube-api-access-n8dlr") pod "9c22b88a-ce63-4c4a-a606-17c563e9e156" (UID: "9c22b88a-ce63-4c4a-a606-17c563e9e156"). InnerVolumeSpecName "kube-api-access-n8dlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.014387 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8dlr\" (UniqueName: \"kubernetes.io/projected/9c22b88a-ce63-4c4a-a606-17c563e9e156-kube-api-access-n8dlr\") on node \"crc\" DevicePath \"\"" Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.402938 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" event={"ID":"9c22b88a-ce63-4c4a-a606-17c563e9e156","Type":"ContainerDied","Data":"9b1442d84ebc314d87a87c497d41a78e35b6dda32220bfadc36e5e2d1a796ff3"} Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.402988 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b1442d84ebc314d87a87c497d41a78e35b6dda32220bfadc36e5e2d1a796ff3" Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.403000 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552260-m5d6m" Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.779130 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552254-d9q7p"] Mar 10 09:40:05 crc kubenswrapper[4883]: I0310 09:40:05.784110 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552254-d9q7p"] Mar 10 09:40:06 crc kubenswrapper[4883]: I0310 09:40:06.090720 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b477b90a-75af-4621-8c33-21fdd8c9c749" path="/var/lib/kubelet/pods/b477b90a-75af-4621-8c33-21fdd8c9c749/volumes" Mar 10 09:40:17 crc kubenswrapper[4883]: I0310 09:40:17.449106 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:40:17 crc kubenswrapper[4883]: I0310 09:40:17.449826 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:40:47 crc kubenswrapper[4883]: I0310 09:40:47.449312 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:40:47 crc kubenswrapper[4883]: I0310 09:40:47.450111 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:41:05 crc kubenswrapper[4883]: I0310 09:41:05.893658 4883 scope.go:117] "RemoveContainer" containerID="6864877f7abf0513eaa87f372fd6fb5c7baab57f240f7c0cd19def879aaf0dc8" Mar 10 09:41:17 crc kubenswrapper[4883]: I0310 09:41:17.448918 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:41:17 crc kubenswrapper[4883]: I0310 09:41:17.449616 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:41:17 crc kubenswrapper[4883]: I0310 09:41:17.449672 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:41:17 crc kubenswrapper[4883]: I0310 09:41:17.450315 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:41:17 crc kubenswrapper[4883]: I0310 09:41:17.450393 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" gracePeriod=600 Mar 10 09:41:17 crc kubenswrapper[4883]: E0310 09:41:17.578442 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:41:18 crc kubenswrapper[4883]: I0310 09:41:18.046205 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" exitCode=0 Mar 10 09:41:18 crc kubenswrapper[4883]: I0310 09:41:18.046265 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b"} Mar 10 09:41:18 crc kubenswrapper[4883]: I0310 09:41:18.046864 4883 scope.go:117] "RemoveContainer" containerID="5c1139e014fa94ac8c68493a0442f3c82dbfbb40d973771c846961471e7e8c0d" Mar 10 09:41:18 crc kubenswrapper[4883]: I0310 09:41:18.047650 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:41:18 crc kubenswrapper[4883]: E0310 09:41:18.048043 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:41:33 crc kubenswrapper[4883]: I0310 09:41:33.080383 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:41:33 crc kubenswrapper[4883]: E0310 09:41:33.081322 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.467181 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:35 crc kubenswrapper[4883]: E0310 09:41:35.467929 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c22b88a-ce63-4c4a-a606-17c563e9e156" containerName="oc" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.467944 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c22b88a-ce63-4c4a-a606-17c563e9e156" containerName="oc" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.468127 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c22b88a-ce63-4c4a-a606-17c563e9e156" containerName="oc" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.469423 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.473977 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.573886 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.573931 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.574660 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndklf\" (UniqueName: \"kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.659505 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.661256 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.673804 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.676238 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.676275 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.676578 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndklf\" (UniqueName: \"kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.676796 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.676843 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.697252 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndklf\" (UniqueName: \"kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf\") pod \"certified-operators-df97s\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.778932 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.779069 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.779096 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl7lg\" (UniqueName: \"kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.786223 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.881182 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.881413 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.881453 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl7lg\" (UniqueName: \"kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.881712 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.881825 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " 
pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.897427 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl7lg\" (UniqueName: \"kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg\") pod \"redhat-operators-2s8h8\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:35 crc kubenswrapper[4883]: I0310 09:41:35.977165 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:36 crc kubenswrapper[4883]: I0310 09:41:36.272768 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:36 crc kubenswrapper[4883]: I0310 09:41:36.443871 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.224864 4883 generic.go:334] "Generic (PLEG): container finished" podID="d82be883-fb56-4980-855b-29e3c65804f0" containerID="3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2" exitCode=0 Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.224993 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerDied","Data":"3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2"} Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.225036 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerStarted","Data":"9e2949119360f4c1ef4ce52ba4f41c2ada91976532782b690ae2143396e06593"} Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.227235 4883 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.227536 4883 generic.go:334] "Generic (PLEG): container finished" podID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerID="ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593" exitCode=0 Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.227626 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerDied","Data":"ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593"} Mar 10 09:41:37 crc kubenswrapper[4883]: I0310 09:41:37.227821 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerStarted","Data":"451f307721bd95bb6eec1741406d0e7d216607c63a0692d2884bbb664bacc51f"} Mar 10 09:41:38 crc kubenswrapper[4883]: I0310 09:41:38.242856 4883 generic.go:334] "Generic (PLEG): container finished" podID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerID="75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8" exitCode=0 Mar 10 09:41:38 crc kubenswrapper[4883]: I0310 09:41:38.242956 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerDied","Data":"75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8"} Mar 10 09:41:39 crc kubenswrapper[4883]: I0310 09:41:39.258160 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerStarted","Data":"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0"} Mar 10 09:41:39 crc kubenswrapper[4883]: I0310 09:41:39.261425 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerStarted","Data":"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5"} Mar 10 09:41:39 crc kubenswrapper[4883]: I0310 09:41:39.284609 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-df97s" podStartSLOduration=2.804009525 podStartE2EDuration="4.284597673s" podCreationTimestamp="2026-03-10 09:41:35 +0000 UTC" firstStartedPulling="2026-03-10 09:41:37.229797687 +0000 UTC m=+2283.484695577" lastFinishedPulling="2026-03-10 09:41:38.710385835 +0000 UTC m=+2284.965283725" observedRunningTime="2026-03-10 09:41:39.281296206 +0000 UTC m=+2285.536194096" watchObservedRunningTime="2026-03-10 09:41:39.284597673 +0000 UTC m=+2285.539495563" Mar 10 09:41:41 crc kubenswrapper[4883]: I0310 09:41:41.282813 4883 generic.go:334] "Generic (PLEG): container finished" podID="d82be883-fb56-4980-855b-29e3c65804f0" containerID="8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5" exitCode=0 Mar 10 09:41:41 crc kubenswrapper[4883]: I0310 09:41:41.282946 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerDied","Data":"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5"} Mar 10 09:41:42 crc kubenswrapper[4883]: I0310 09:41:42.294454 4883 generic.go:334] "Generic (PLEG): container finished" podID="af134b73-8c24-4b9e-b15e-48ff4b83ecd4" containerID="4951ccf796499d6b47c904d8481fa70523d42eff658bee0de028c1741f6614b5" exitCode=0 Mar 10 09:41:42 crc kubenswrapper[4883]: I0310 09:41:42.294546 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" event={"ID":"af134b73-8c24-4b9e-b15e-48ff4b83ecd4","Type":"ContainerDied","Data":"4951ccf796499d6b47c904d8481fa70523d42eff658bee0de028c1741f6614b5"} 
Mar 10 09:41:42 crc kubenswrapper[4883]: I0310 09:41:42.298785 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerStarted","Data":"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468"} Mar 10 09:41:42 crc kubenswrapper[4883]: I0310 09:41:42.330017 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2s8h8" podStartSLOduration=2.744273642 podStartE2EDuration="7.330002342s" podCreationTimestamp="2026-03-10 09:41:35 +0000 UTC" firstStartedPulling="2026-03-10 09:41:37.226996625 +0000 UTC m=+2283.481894514" lastFinishedPulling="2026-03-10 09:41:41.812725325 +0000 UTC m=+2288.067623214" observedRunningTime="2026-03-10 09:41:42.325796621 +0000 UTC m=+2288.580694510" watchObservedRunningTime="2026-03-10 09:41:42.330002342 +0000 UTC m=+2288.584900231" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.675865 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.854928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd8p2\" (UniqueName: \"kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855026 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855062 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855181 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855251 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: 
I0310 09:41:43.855414 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855448 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855568 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855629 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855724 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.855769 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam\") pod \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\" (UID: \"af134b73-8c24-4b9e-b15e-48ff4b83ecd4\") " Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.865150 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2" (OuterVolumeSpecName: "kube-api-access-jd8p2") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "kube-api-access-jd8p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.877276 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.881232 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.882265 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.882811 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.884736 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.886923 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.887374 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.891726 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.893866 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.897390 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory" (OuterVolumeSpecName: "inventory") pod "af134b73-8c24-4b9e-b15e-48ff4b83ecd4" (UID: "af134b73-8c24-4b9e-b15e-48ff4b83ecd4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960125 4883 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960156 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960168 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd8p2\" (UniqueName: \"kubernetes.io/projected/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-kube-api-access-jd8p2\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960182 4883 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960192 4883 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960201 4883 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960213 4883 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960221 4883 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960232 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960245 4883 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:43 crc kubenswrapper[4883]: I0310 09:41:43.960254 4883 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/af134b73-8c24-4b9e-b15e-48ff4b83ecd4-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.104321 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:41:44 crc kubenswrapper[4883]: E0310 09:41:44.104982 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.315169 4883 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" event={"ID":"af134b73-8c24-4b9e-b15e-48ff4b83ecd4","Type":"ContainerDied","Data":"3f29108c5dad1a5578dac32057489e322fee1b450f1de55af05d05420ce128d5"} Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.315630 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f29108c5dad1a5578dac32057489e322fee1b450f1de55af05d05420ce128d5" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.315238 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-47dxf" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.407861 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56"] Mar 10 09:41:44 crc kubenswrapper[4883]: E0310 09:41:44.408304 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af134b73-8c24-4b9e-b15e-48ff4b83ecd4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.408324 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="af134b73-8c24-4b9e-b15e-48ff4b83ecd4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.408544 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="af134b73-8c24-4b9e-b15e-48ff4b83ecd4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.409256 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.413254 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.413271 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.413498 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jqzh5" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.413652 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.413800 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.420934 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56"] Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471034 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471110 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471268 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471327 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471442 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm9lt\" (UniqueName: \"kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.471579 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573150 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm9lt\" (UniqueName: \"kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573253 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573302 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 
09:41:44.573329 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573409 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573447 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.573503 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.578276 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory\") 
pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.578974 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.579270 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.579469 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.579760 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" 
Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.582344 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.590325 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm9lt\" (UniqueName: \"kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-blk56\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:44 crc kubenswrapper[4883]: I0310 09:41:44.728204 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.247685 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56"] Mar 10 09:41:45 crc kubenswrapper[4883]: W0310 09:41:45.250072 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb083d3b3_edb7_4d2f_a7b7_f1275bd83fde.slice/crio-63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce WatchSource:0}: Error finding container 63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce: Status 404 returned error can't find the container with id 63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.323256 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" event={"ID":"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde","Type":"ContainerStarted","Data":"63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce"} Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.786587 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.787442 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.830503 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.977701 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:45 crc kubenswrapper[4883]: I0310 09:41:45.977756 4883 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:46 crc kubenswrapper[4883]: I0310 09:41:46.332772 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" event={"ID":"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde","Type":"ContainerStarted","Data":"6fc9eb7a0760205fbfc253c3603eaaef5b472d6ed42e6664b249cce521606f18"} Mar 10 09:41:46 crc kubenswrapper[4883]: I0310 09:41:46.357841 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" podStartSLOduration=1.677354179 podStartE2EDuration="2.357819054s" podCreationTimestamp="2026-03-10 09:41:44 +0000 UTC" firstStartedPulling="2026-03-10 09:41:45.252626725 +0000 UTC m=+2291.507524605" lastFinishedPulling="2026-03-10 09:41:45.933091592 +0000 UTC m=+2292.187989480" observedRunningTime="2026-03-10 09:41:46.353019403 +0000 UTC m=+2292.607917292" watchObservedRunningTime="2026-03-10 09:41:46.357819054 +0000 UTC m=+2292.612716943" Mar 10 09:41:46 crc kubenswrapper[4883]: I0310 09:41:46.378914 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:46 crc kubenswrapper[4883]: I0310 09:41:46.431835 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:47 crc kubenswrapper[4883]: I0310 09:41:47.018879 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2s8h8" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="registry-server" probeResult="failure" output=< Mar 10 09:41:47 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s Mar 10 09:41:47 crc kubenswrapper[4883]: > Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.349336 4883 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/certified-operators-df97s" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="registry-server" containerID="cri-o://3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0" gracePeriod=2 Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.737374 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.870017 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content\") pod \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.870190 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndklf\" (UniqueName: \"kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf\") pod \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.870247 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities\") pod \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\" (UID: \"109ee6ef-7197-40d5-82cf-4b34bcad2ecc\") " Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.870929 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities" (OuterVolumeSpecName: "utilities") pod "109ee6ef-7197-40d5-82cf-4b34bcad2ecc" (UID: "109ee6ef-7197-40d5-82cf-4b34bcad2ecc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.871356 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.876673 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf" (OuterVolumeSpecName: "kube-api-access-ndklf") pod "109ee6ef-7197-40d5-82cf-4b34bcad2ecc" (UID: "109ee6ef-7197-40d5-82cf-4b34bcad2ecc"). InnerVolumeSpecName "kube-api-access-ndklf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.915487 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "109ee6ef-7197-40d5-82cf-4b34bcad2ecc" (UID: "109ee6ef-7197-40d5-82cf-4b34bcad2ecc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.973799 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:48 crc kubenswrapper[4883]: I0310 09:41:48.973831 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndklf\" (UniqueName: \"kubernetes.io/projected/109ee6ef-7197-40d5-82cf-4b34bcad2ecc-kube-api-access-ndklf\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.367320 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df97s" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.367324 4883 generic.go:334] "Generic (PLEG): container finished" podID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerID="3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0" exitCode=0 Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.367315 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerDied","Data":"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0"} Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.367512 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df97s" event={"ID":"109ee6ef-7197-40d5-82cf-4b34bcad2ecc","Type":"ContainerDied","Data":"451f307721bd95bb6eec1741406d0e7d216607c63a0692d2884bbb664bacc51f"} Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.367543 4883 scope.go:117] "RemoveContainer" containerID="3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.406396 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.409823 4883 scope.go:117] "RemoveContainer" containerID="75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.419338 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-df97s"] Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.436251 4883 scope.go:117] "RemoveContainer" containerID="ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.473123 4883 scope.go:117] "RemoveContainer" 
containerID="3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0" Mar 10 09:41:49 crc kubenswrapper[4883]: E0310 09:41:49.473648 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0\": container with ID starting with 3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0 not found: ID does not exist" containerID="3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.473691 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0"} err="failed to get container status \"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0\": rpc error: code = NotFound desc = could not find container \"3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0\": container with ID starting with 3dbfd6561cd94319dc6333af3cfadcf93f36959eea013206541fbf01459823c0 not found: ID does not exist" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.473729 4883 scope.go:117] "RemoveContainer" containerID="75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8" Mar 10 09:41:49 crc kubenswrapper[4883]: E0310 09:41:49.474345 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8\": container with ID starting with 75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8 not found: ID does not exist" containerID="75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.474405 4883 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8"} err="failed to get container status \"75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8\": rpc error: code = NotFound desc = could not find container \"75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8\": container with ID starting with 75f76faddf541f9c7343618d4590d633f426042264b45b551f8c7fc2f8eb1fb8 not found: ID does not exist" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.474446 4883 scope.go:117] "RemoveContainer" containerID="ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593" Mar 10 09:41:49 crc kubenswrapper[4883]: E0310 09:41:49.474936 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593\": container with ID starting with ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593 not found: ID does not exist" containerID="ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593" Mar 10 09:41:49 crc kubenswrapper[4883]: I0310 09:41:49.474967 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593"} err="failed to get container status \"ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593\": rpc error: code = NotFound desc = could not find container \"ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593\": container with ID starting with ea0f5cb737734e7edf16ea3a2ae2f7038d4c37e7a9f0f49f68aabc7d0730e593 not found: ID does not exist" Mar 10 09:41:50 crc kubenswrapper[4883]: I0310 09:41:50.091774 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" path="/var/lib/kubelet/pods/109ee6ef-7197-40d5-82cf-4b34bcad2ecc/volumes" Mar 10 09:41:56 crc kubenswrapper[4883]: I0310 
09:41:56.020295 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:56 crc kubenswrapper[4883]: I0310 09:41:56.065792 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:56 crc kubenswrapper[4883]: I0310 09:41:56.084764 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:41:56 crc kubenswrapper[4883]: E0310 09:41:56.085152 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.055418 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.445280 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2s8h8" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="registry-server" containerID="cri-o://ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468" gracePeriod=2 Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.845672 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.966116 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities\") pod \"d82be883-fb56-4980-855b-29e3c65804f0\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.966272 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl7lg\" (UniqueName: \"kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg\") pod \"d82be883-fb56-4980-855b-29e3c65804f0\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.966538 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content\") pod \"d82be883-fb56-4980-855b-29e3c65804f0\" (UID: \"d82be883-fb56-4980-855b-29e3c65804f0\") " Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.966958 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities" (OuterVolumeSpecName: "utilities") pod "d82be883-fb56-4980-855b-29e3c65804f0" (UID: "d82be883-fb56-4980-855b-29e3c65804f0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.967086 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:57 crc kubenswrapper[4883]: I0310 09:41:57.972980 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg" (OuterVolumeSpecName: "kube-api-access-zl7lg") pod "d82be883-fb56-4980-855b-29e3c65804f0" (UID: "d82be883-fb56-4980-855b-29e3c65804f0"). InnerVolumeSpecName "kube-api-access-zl7lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.061212 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d82be883-fb56-4980-855b-29e3c65804f0" (UID: "d82be883-fb56-4980-855b-29e3c65804f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.069510 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d82be883-fb56-4980-855b-29e3c65804f0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.069546 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl7lg\" (UniqueName: \"kubernetes.io/projected/d82be883-fb56-4980-855b-29e3c65804f0-kube-api-access-zl7lg\") on node \"crc\" DevicePath \"\"" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.455298 4883 generic.go:334] "Generic (PLEG): container finished" podID="d82be883-fb56-4980-855b-29e3c65804f0" containerID="ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468" exitCode=0 Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.455353 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerDied","Data":"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468"} Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.455365 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2s8h8" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.455393 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s8h8" event={"ID":"d82be883-fb56-4980-855b-29e3c65804f0","Type":"ContainerDied","Data":"9e2949119360f4c1ef4ce52ba4f41c2ada91976532782b690ae2143396e06593"} Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.455415 4883 scope.go:117] "RemoveContainer" containerID="ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.477033 4883 scope.go:117] "RemoveContainer" containerID="8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.481138 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.487306 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2s8h8"] Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.495882 4883 scope.go:117] "RemoveContainer" containerID="3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.535565 4883 scope.go:117] "RemoveContainer" containerID="ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468" Mar 10 09:41:58 crc kubenswrapper[4883]: E0310 09:41:58.536175 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468\": container with ID starting with ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468 not found: ID does not exist" containerID="ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.536207 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468"} err="failed to get container status \"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468\": rpc error: code = NotFound desc = could not find container \"ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468\": container with ID starting with ee8154982f7ba8a090196b403cb86fcab07b4891e9e958f227e782c8c6595468 not found: ID does not exist" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.536230 4883 scope.go:117] "RemoveContainer" containerID="8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5" Mar 10 09:41:58 crc kubenswrapper[4883]: E0310 09:41:58.536748 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5\": container with ID starting with 8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5 not found: ID does not exist" containerID="8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.536790 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5"} err="failed to get container status \"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5\": rpc error: code = NotFound desc = could not find container \"8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5\": container with ID starting with 8458eae7c2989f15d17326cda51473bb22714ce86224dde78869c754c53854b5 not found: ID does not exist" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.536821 4883 scope.go:117] "RemoveContainer" containerID="3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2" Mar 10 09:41:58 crc kubenswrapper[4883]: E0310 
09:41:58.537257 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2\": container with ID starting with 3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2 not found: ID does not exist" containerID="3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2" Mar 10 09:41:58 crc kubenswrapper[4883]: I0310 09:41:58.537285 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2"} err="failed to get container status \"3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2\": rpc error: code = NotFound desc = could not find container \"3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2\": container with ID starting with 3326982d2c0524d73c1883bb79ffe03cfa2559772f8873ef264009c7f83fd8e2 not found: ID does not exist" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.104901 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d82be883-fb56-4980-855b-29e3c65804f0" path="/var/lib/kubelet/pods/d82be883-fb56-4980-855b-29e3c65804f0/volumes" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.137752 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552262-5shjs"] Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138195 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="extract-content" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138210 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="extract-content" Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138236 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="extract-content" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138243 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="extract-content" Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138260 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138266 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138288 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138294 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138307 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="extract-utilities" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138313 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="extract-utilities" Mar 10 09:42:00 crc kubenswrapper[4883]: E0310 09:42:00.138333 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="extract-utilities" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138339 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="extract-utilities" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138603 4883 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="109ee6ef-7197-40d5-82cf-4b34bcad2ecc" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.138614 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82be883-fb56-4980-855b-29e3c65804f0" containerName="registry-server" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.139407 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.141782 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.141954 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.142428 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.149610 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552262-5shjs"] Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.213232 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x987\" (UniqueName: \"kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987\") pod \"auto-csr-approver-29552262-5shjs\" (UID: \"5a7f73ce-4bec-451b-8fc7-a787366b6001\") " pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.315560 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x987\" (UniqueName: \"kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987\") pod \"auto-csr-approver-29552262-5shjs\" (UID: \"5a7f73ce-4bec-451b-8fc7-a787366b6001\") " 
pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.344018 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x987\" (UniqueName: \"kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987\") pod \"auto-csr-approver-29552262-5shjs\" (UID: \"5a7f73ce-4bec-451b-8fc7-a787366b6001\") " pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.458838 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:00 crc kubenswrapper[4883]: I0310 09:42:00.885114 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552262-5shjs"] Mar 10 09:42:00 crc kubenswrapper[4883]: W0310 09:42:00.888913 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a7f73ce_4bec_451b_8fc7_a787366b6001.slice/crio-3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265 WatchSource:0}: Error finding container 3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265: Status 404 returned error can't find the container with id 3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265 Mar 10 09:42:01 crc kubenswrapper[4883]: I0310 09:42:01.484514 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552262-5shjs" event={"ID":"5a7f73ce-4bec-451b-8fc7-a787366b6001","Type":"ContainerStarted","Data":"3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265"} Mar 10 09:42:02 crc kubenswrapper[4883]: I0310 09:42:02.493025 4883 generic.go:334] "Generic (PLEG): container finished" podID="5a7f73ce-4bec-451b-8fc7-a787366b6001" containerID="cbfa56661b259829721fddd0e6afd1c57e6fb0cd59f60d7658cb1739ef3cba81" exitCode=0 Mar 10 09:42:02 crc kubenswrapper[4883]: 
I0310 09:42:02.493138 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552262-5shjs" event={"ID":"5a7f73ce-4bec-451b-8fc7-a787366b6001","Type":"ContainerDied","Data":"cbfa56661b259829721fddd0e6afd1c57e6fb0cd59f60d7658cb1739ef3cba81"} Mar 10 09:42:03 crc kubenswrapper[4883]: I0310 09:42:03.802607 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:03 crc kubenswrapper[4883]: I0310 09:42:03.995205 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x987\" (UniqueName: \"kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987\") pod \"5a7f73ce-4bec-451b-8fc7-a787366b6001\" (UID: \"5a7f73ce-4bec-451b-8fc7-a787366b6001\") " Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.001694 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987" (OuterVolumeSpecName: "kube-api-access-6x987") pod "5a7f73ce-4bec-451b-8fc7-a787366b6001" (UID: "5a7f73ce-4bec-451b-8fc7-a787366b6001"). InnerVolumeSpecName "kube-api-access-6x987". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.098466 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x987\" (UniqueName: \"kubernetes.io/projected/5a7f73ce-4bec-451b-8fc7-a787366b6001-kube-api-access-6x987\") on node \"crc\" DevicePath \"\"" Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.513646 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552262-5shjs" event={"ID":"5a7f73ce-4bec-451b-8fc7-a787366b6001","Type":"ContainerDied","Data":"3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265"} Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.513695 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552262-5shjs" Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.513716 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ee1abec5a1d75f23926eaa3527591ce75a911a2ab2367282c8a5a4b675ad265" Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.879608 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552256-fmqzj"] Mar 10 09:42:04 crc kubenswrapper[4883]: I0310 09:42:04.887255 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552256-fmqzj"] Mar 10 09:42:06 crc kubenswrapper[4883]: I0310 09:42:06.090312 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70e8b0b-51ad-4080-8955-8aa8ee68f274" path="/var/lib/kubelet/pods/c70e8b0b-51ad-4080-8955-8aa8ee68f274/volumes" Mar 10 09:42:08 crc kubenswrapper[4883]: I0310 09:42:08.080282 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:42:08 crc kubenswrapper[4883]: E0310 09:42:08.081236 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:42:20 crc kubenswrapper[4883]: I0310 09:42:20.080005 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:42:20 crc kubenswrapper[4883]: E0310 09:42:20.080953 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:42:33 crc kubenswrapper[4883]: I0310 09:42:33.080591 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:42:33 crc kubenswrapper[4883]: E0310 09:42:33.081387 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:42:45 crc kubenswrapper[4883]: I0310 09:42:45.080435 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:42:45 crc kubenswrapper[4883]: E0310 09:42:45.081408 4883 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:42:56 crc kubenswrapper[4883]: I0310 09:42:56.080470 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:42:56 crc kubenswrapper[4883]: E0310 09:42:56.081732 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:43:06 crc kubenswrapper[4883]: I0310 09:43:06.008189 4883 scope.go:117] "RemoveContainer" containerID="f3c4bd29935ee27cc39cdc84c3f636e36dc2e8b9c1bfc85b30473304c9e1a2ff" Mar 10 09:43:09 crc kubenswrapper[4883]: I0310 09:43:09.080360 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:43:09 crc kubenswrapper[4883]: E0310 09:43:09.081621 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:43:22 crc kubenswrapper[4883]: I0310 09:43:22.079563 4883 scope.go:117] 
"RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:43:22 crc kubenswrapper[4883]: E0310 09:43:22.080493 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:43:36 crc kubenswrapper[4883]: I0310 09:43:36.079493 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:43:36 crc kubenswrapper[4883]: E0310 09:43:36.080412 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:43:38 crc kubenswrapper[4883]: I0310 09:43:38.314510 4883 generic.go:334] "Generic (PLEG): container finished" podID="b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" containerID="6fc9eb7a0760205fbfc253c3603eaaef5b472d6ed42e6664b249cce521606f18" exitCode=0 Mar 10 09:43:38 crc kubenswrapper[4883]: I0310 09:43:38.314585 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" event={"ID":"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde","Type":"ContainerDied","Data":"6fc9eb7a0760205fbfc253c3603eaaef5b472d6ed42e6664b249cce521606f18"} Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.644856 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.719953 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.720000 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.742997 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.743740 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821164 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821246 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821300 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821331 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm9lt\" (UniqueName: \"kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821362 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2\") pod \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\" (UID: \"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde\") " Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821677 4883 
reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.821694 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.823967 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt" (OuterVolumeSpecName: "kube-api-access-lm9lt") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "kube-api-access-lm9lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.824170 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.839888 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.840069 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.841092 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory" (OuterVolumeSpecName: "inventory") pod "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" (UID: "b083d3b3-edb7-4d2f-a7b7-f1275bd83fde"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.924242 4883 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.924532 4883 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.924543 4883 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.924553 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm9lt\" (UniqueName: 
\"kubernetes.io/projected/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-kube-api-access-lm9lt\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:39 crc kubenswrapper[4883]: I0310 09:43:39.924562 4883 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b083d3b3-edb7-4d2f-a7b7-f1275bd83fde-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 10 09:43:40 crc kubenswrapper[4883]: I0310 09:43:40.332347 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" event={"ID":"b083d3b3-edb7-4d2f-a7b7-f1275bd83fde","Type":"ContainerDied","Data":"63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce"} Mar 10 09:43:40 crc kubenswrapper[4883]: I0310 09:43:40.332616 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63a049843d7dd2e2c005a5680366d35338216e5278e83ba6e24e822f31c706ce" Mar 10 09:43:40 crc kubenswrapper[4883]: I0310 09:43:40.332396 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-blk56" Mar 10 09:43:47 crc kubenswrapper[4883]: I0310 09:43:47.080645 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:43:47 crc kubenswrapper[4883]: E0310 09:43:47.081768 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.080369 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:44:00 crc kubenswrapper[4883]: E0310 09:44:00.081319 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.139522 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552264-fxbzh"] Mar 10 09:44:00 crc kubenswrapper[4883]: E0310 09:44:00.139985 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.140006 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 09:44:00 crc kubenswrapper[4883]: E0310 09:44:00.140017 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7f73ce-4bec-451b-8fc7-a787366b6001" containerName="oc" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.140023 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7f73ce-4bec-451b-8fc7-a787366b6001" containerName="oc" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.140227 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="b083d3b3-edb7-4d2f-a7b7-f1275bd83fde" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.140251 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7f73ce-4bec-451b-8fc7-a787366b6001" containerName="oc" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.140959 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.143973 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.144139 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.144634 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.149499 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552264-fxbzh"] Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.177337 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8h9j\" (UniqueName: \"kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j\") pod \"auto-csr-approver-29552264-fxbzh\" (UID: \"e7561a55-c8cc-4fad-99cf-6a81612efa5f\") " pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.279911 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8h9j\" (UniqueName: \"kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j\") pod \"auto-csr-approver-29552264-fxbzh\" (UID: \"e7561a55-c8cc-4fad-99cf-6a81612efa5f\") " pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.298317 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8h9j\" (UniqueName: \"kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j\") pod \"auto-csr-approver-29552264-fxbzh\" (UID: \"e7561a55-c8cc-4fad-99cf-6a81612efa5f\") " 
pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.459822 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:00 crc kubenswrapper[4883]: I0310 09:44:00.888918 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552264-fxbzh"] Mar 10 09:44:00 crc kubenswrapper[4883]: W0310 09:44:00.898963 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7561a55_c8cc_4fad_99cf_6a81612efa5f.slice/crio-944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7 WatchSource:0}: Error finding container 944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7: Status 404 returned error can't find the container with id 944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7 Mar 10 09:44:01 crc kubenswrapper[4883]: I0310 09:44:01.512129 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" event={"ID":"e7561a55-c8cc-4fad-99cf-6a81612efa5f","Type":"ContainerStarted","Data":"944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7"} Mar 10 09:44:02 crc kubenswrapper[4883]: I0310 09:44:02.532141 4883 generic.go:334] "Generic (PLEG): container finished" podID="e7561a55-c8cc-4fad-99cf-6a81612efa5f" containerID="7a20c7b029586fbd175b90818495ae7de2932811c342c11864ee48f481c0032f" exitCode=0 Mar 10 09:44:02 crc kubenswrapper[4883]: I0310 09:44:02.532235 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" event={"ID":"e7561a55-c8cc-4fad-99cf-6a81612efa5f","Type":"ContainerDied","Data":"7a20c7b029586fbd175b90818495ae7de2932811c342c11864ee48f481c0032f"} Mar 10 09:44:03 crc kubenswrapper[4883]: I0310 09:44:03.830727 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:03 crc kubenswrapper[4883]: I0310 09:44:03.962105 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8h9j\" (UniqueName: \"kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j\") pod \"e7561a55-c8cc-4fad-99cf-6a81612efa5f\" (UID: \"e7561a55-c8cc-4fad-99cf-6a81612efa5f\") " Mar 10 09:44:03 crc kubenswrapper[4883]: I0310 09:44:03.973759 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j" (OuterVolumeSpecName: "kube-api-access-m8h9j") pod "e7561a55-c8cc-4fad-99cf-6a81612efa5f" (UID: "e7561a55-c8cc-4fad-99cf-6a81612efa5f"). InnerVolumeSpecName "kube-api-access-m8h9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.065305 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8h9j\" (UniqueName: \"kubernetes.io/projected/e7561a55-c8cc-4fad-99cf-6a81612efa5f-kube-api-access-m8h9j\") on node \"crc\" DevicePath \"\"" Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.554029 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" event={"ID":"e7561a55-c8cc-4fad-99cf-6a81612efa5f","Type":"ContainerDied","Data":"944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7"} Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.554347 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="944177a2bfa8d0029ced0a2644b9188cbe3400a6beb4e31b0d0458119eda92e7" Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.554091 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552264-fxbzh" Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.892907 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552258-6pf25"] Mar 10 09:44:04 crc kubenswrapper[4883]: I0310 09:44:04.903866 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552258-6pf25"] Mar 10 09:44:06 crc kubenswrapper[4883]: I0310 09:44:06.092020 4883 scope.go:117] "RemoveContainer" containerID="0172a22ca1b2674eaf7b22f058338bb6d8a1a070eb6300e099b6179e3eec55d7" Mar 10 09:44:06 crc kubenswrapper[4883]: I0310 09:44:06.095344 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff28fc4-3134-48ba-9697-b74e9b4e6ec5" path="/var/lib/kubelet/pods/2ff28fc4-3134-48ba-9697-b74e9b4e6ec5/volumes" Mar 10 09:44:12 crc kubenswrapper[4883]: I0310 09:44:12.080373 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:44:12 crc kubenswrapper[4883]: E0310 09:44:12.081138 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:24 crc kubenswrapper[4883]: I0310 09:44:24.086231 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:44:24 crc kubenswrapper[4883]: E0310 09:44:24.087371 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.079410 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 09:44:25 crc kubenswrapper[4883]: E0310 09:44:25.080263 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7561a55-c8cc-4fad-99cf-6a81612efa5f" containerName="oc" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.080333 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7561a55-c8cc-4fad-99cf-6a81612efa5f" containerName="oc" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.080640 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7561a55-c8cc-4fad-99cf-6a81612efa5f" containerName="oc" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.081746 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.083442 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.084129 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.084179 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.085207 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fm4md" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.085322 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143384 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143460 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143545 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret\") 
pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143584 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143625 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143655 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rczsq\" (UniqueName: \"kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143710 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143735 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary\") pod 
\"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.143767 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245550 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245630 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245670 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245701 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") 
" pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245732 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczsq\" (UniqueName: \"kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245780 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245802 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245844 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.245881 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc 
kubenswrapper[4883]: I0310 09:44:25.246577 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.246667 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.247022 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.247044 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.247275 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.252067 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.252878 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.254190 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.260805 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczsq\" (UniqueName: \"kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.268007 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.404442 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:44:25 crc kubenswrapper[4883]: I0310 09:44:25.828204 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 09:44:26 crc kubenswrapper[4883]: I0310 09:44:26.750063 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d483d791-15b3-49e7-8095-5660a9d0fdaa","Type":"ContainerStarted","Data":"5e9e0098f227f9af35dc0b77276abeb28187e6a5424e5047b3daacd6cc5a8286"} Mar 10 09:44:36 crc kubenswrapper[4883]: I0310 09:44:36.081531 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:44:36 crc kubenswrapper[4883]: E0310 09:44:36.082684 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:49 crc kubenswrapper[4883]: I0310 09:44:49.080397 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:44:49 crc kubenswrapper[4883]: E0310 09:44:49.081362 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:44:54 crc kubenswrapper[4883]: E0310 09:44:54.579062 4883 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 10 09:44:54 crc kubenswrapper[4883]: E0310 09:44:54.579596 4883 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh
_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczsq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d483d791-15b3-49e7-8095-5660a9d0fdaa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:44:54 crc kubenswrapper[4883]: E0310 09:44:54.580819 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="d483d791-15b3-49e7-8095-5660a9d0fdaa" Mar 10 09:44:55 crc kubenswrapper[4883]: E0310 09:44:55.047240 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d483d791-15b3-49e7-8095-5660a9d0fdaa" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.148547 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7"] Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.150260 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.151451 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7"] Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.152951 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.153861 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.261238 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.261616 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume\") pod 
\"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.262084 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xkbb\" (UniqueName: \"kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.364833 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xkbb\" (UniqueName: \"kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.364913 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.364961 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.365840 4883 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.371862 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.381650 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xkbb\" (UniqueName: \"kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb\") pod \"collect-profiles-29552265-f9vj7\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.469052 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:00 crc kubenswrapper[4883]: I0310 09:45:00.870227 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7"] Mar 10 09:45:01 crc kubenswrapper[4883]: I0310 09:45:01.080009 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:45:01 crc kubenswrapper[4883]: E0310 09:45:01.080236 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:45:01 crc kubenswrapper[4883]: I0310 09:45:01.094827 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" event={"ID":"71905d96-5939-40cc-99ff-40da96706a63","Type":"ContainerStarted","Data":"b70c70acb4f545566eaae90a1cb0e6aa80d1cb1b44c83724d42075e959d24dbd"} Mar 10 09:45:01 crc kubenswrapper[4883]: I0310 09:45:01.094879 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" event={"ID":"71905d96-5939-40cc-99ff-40da96706a63","Type":"ContainerStarted","Data":"450ab2e2dbe2709169f581b8f1940b8e48c5198bc1a429dda05302a678f451db"} Mar 10 09:45:01 crc kubenswrapper[4883]: I0310 09:45:01.110925 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" podStartSLOduration=1.110904033 podStartE2EDuration="1.110904033s" podCreationTimestamp="2026-03-10 09:45:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:45:01.107442426 +0000 UTC m=+2487.362340315" watchObservedRunningTime="2026-03-10 09:45:01.110904033 +0000 UTC m=+2487.365801922" Mar 10 09:45:02 crc kubenswrapper[4883]: I0310 09:45:02.109561 4883 generic.go:334] "Generic (PLEG): container finished" podID="71905d96-5939-40cc-99ff-40da96706a63" containerID="b70c70acb4f545566eaae90a1cb0e6aa80d1cb1b44c83724d42075e959d24dbd" exitCode=0 Mar 10 09:45:02 crc kubenswrapper[4883]: I0310 09:45:02.109881 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" event={"ID":"71905d96-5939-40cc-99ff-40da96706a63","Type":"ContainerDied","Data":"b70c70acb4f545566eaae90a1cb0e6aa80d1cb1b44c83724d42075e959d24dbd"} Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.410221 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.539943 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume\") pod \"71905d96-5939-40cc-99ff-40da96706a63\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.540034 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume\") pod \"71905d96-5939-40cc-99ff-40da96706a63\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.540066 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xkbb\" (UniqueName: 
\"kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb\") pod \"71905d96-5939-40cc-99ff-40da96706a63\" (UID: \"71905d96-5939-40cc-99ff-40da96706a63\") " Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.540539 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume" (OuterVolumeSpecName: "config-volume") pod "71905d96-5939-40cc-99ff-40da96706a63" (UID: "71905d96-5939-40cc-99ff-40da96706a63"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.546974 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "71905d96-5939-40cc-99ff-40da96706a63" (UID: "71905d96-5939-40cc-99ff-40da96706a63"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.547039 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb" (OuterVolumeSpecName: "kube-api-access-9xkbb") pod "71905d96-5939-40cc-99ff-40da96706a63" (UID: "71905d96-5939-40cc-99ff-40da96706a63"). InnerVolumeSpecName "kube-api-access-9xkbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.643263 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71905d96-5939-40cc-99ff-40da96706a63-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.643558 4883 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71905d96-5939-40cc-99ff-40da96706a63-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:03 crc kubenswrapper[4883]: I0310 09:45:03.643572 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xkbb\" (UniqueName: \"kubernetes.io/projected/71905d96-5939-40cc-99ff-40da96706a63-kube-api-access-9xkbb\") on node \"crc\" DevicePath \"\"" Mar 10 09:45:04 crc kubenswrapper[4883]: I0310 09:45:04.127708 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" event={"ID":"71905d96-5939-40cc-99ff-40da96706a63","Type":"ContainerDied","Data":"450ab2e2dbe2709169f581b8f1940b8e48c5198bc1a429dda05302a678f451db"} Mar 10 09:45:04 crc kubenswrapper[4883]: I0310 09:45:04.127754 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="450ab2e2dbe2709169f581b8f1940b8e48c5198bc1a429dda05302a678f451db" Mar 10 09:45:04 crc kubenswrapper[4883]: I0310 09:45:04.127806 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552265-f9vj7" Mar 10 09:45:04 crc kubenswrapper[4883]: I0310 09:45:04.482223 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"] Mar 10 09:45:04 crc kubenswrapper[4883]: I0310 09:45:04.488658 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-k6w5g"] Mar 10 09:45:06 crc kubenswrapper[4883]: I0310 09:45:06.090639 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be14f8e-b9d8-4058-9be3-cdc61ce88626" path="/var/lib/kubelet/pods/0be14f8e-b9d8-4058-9be3-cdc61ce88626/volumes" Mar 10 09:45:06 crc kubenswrapper[4883]: I0310 09:45:06.170700 4883 scope.go:117] "RemoveContainer" containerID="d15aaac8a80d44b120c58eb49e71ed5d06a04daa936e43f5e09fe3fbce5d0142" Mar 10 09:45:07 crc kubenswrapper[4883]: I0310 09:45:07.645687 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 10 09:45:09 crc kubenswrapper[4883]: I0310 09:45:09.178734 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d483d791-15b3-49e7-8095-5660a9d0fdaa","Type":"ContainerStarted","Data":"20faf1bc2dd52b1aabee2636feb1570644b5e51b82c37399b21f107d33a5382f"} Mar 10 09:45:09 crc kubenswrapper[4883]: I0310 09:45:09.198971 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.3887127870000002 podStartE2EDuration="45.198932078s" podCreationTimestamp="2026-03-10 09:44:24 +0000 UTC" firstStartedPulling="2026-03-10 09:44:25.832184966 +0000 UTC m=+2452.087082855" lastFinishedPulling="2026-03-10 09:45:07.642404257 +0000 UTC m=+2493.897302146" observedRunningTime="2026-03-10 09:45:09.196741778 +0000 UTC m=+2495.451639667" watchObservedRunningTime="2026-03-10 
09:45:09.198932078 +0000 UTC m=+2495.453829967" Mar 10 09:45:16 crc kubenswrapper[4883]: I0310 09:45:16.079715 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:45:16 crc kubenswrapper[4883]: E0310 09:45:16.080656 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:45:30 crc kubenswrapper[4883]: I0310 09:45:30.080170 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:45:30 crc kubenswrapper[4883]: E0310 09:45:30.080972 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:45:44 crc kubenswrapper[4883]: I0310 09:45:44.086439 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:45:44 crc kubenswrapper[4883]: E0310 09:45:44.087301 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:45:59 crc kubenswrapper[4883]: I0310 09:45:59.080009 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:45:59 crc kubenswrapper[4883]: E0310 09:45:59.080803 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.136694 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552266-685zg"] Mar 10 09:46:00 crc kubenswrapper[4883]: E0310 09:46:00.137158 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71905d96-5939-40cc-99ff-40da96706a63" containerName="collect-profiles" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.137174 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71905d96-5939-40cc-99ff-40da96706a63" containerName="collect-profiles" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.137373 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71905d96-5939-40cc-99ff-40da96706a63" containerName="collect-profiles" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.138087 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.139996 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.140169 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.142767 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.143631 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-685zg"] Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.313295 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pjq9\" (UniqueName: \"kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9\") pod \"auto-csr-approver-29552266-685zg\" (UID: \"3a46f17b-70fa-415b-a58a-05fabe683062\") " pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.415716 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pjq9\" (UniqueName: \"kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9\") pod \"auto-csr-approver-29552266-685zg\" (UID: \"3a46f17b-70fa-415b-a58a-05fabe683062\") " pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.432525 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pjq9\" (UniqueName: \"kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9\") pod \"auto-csr-approver-29552266-685zg\" (UID: \"3a46f17b-70fa-415b-a58a-05fabe683062\") " 
pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.456951 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:00 crc kubenswrapper[4883]: I0310 09:46:00.852244 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-685zg"] Mar 10 09:46:01 crc kubenswrapper[4883]: I0310 09:46:01.621712 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552266-685zg" event={"ID":"3a46f17b-70fa-415b-a58a-05fabe683062","Type":"ContainerStarted","Data":"d9b18f5de6dd8fcb31bd49f075439310a45028a1667415604446e36a280624de"} Mar 10 09:46:02 crc kubenswrapper[4883]: I0310 09:46:02.633726 4883 generic.go:334] "Generic (PLEG): container finished" podID="3a46f17b-70fa-415b-a58a-05fabe683062" containerID="8f1431de5f428e41dddc47217c2e968809dd1f4b2b6ca77bcaf70fa3ca340a9d" exitCode=0 Mar 10 09:46:02 crc kubenswrapper[4883]: I0310 09:46:02.633851 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552266-685zg" event={"ID":"3a46f17b-70fa-415b-a58a-05fabe683062","Type":"ContainerDied","Data":"8f1431de5f428e41dddc47217c2e968809dd1f4b2b6ca77bcaf70fa3ca340a9d"} Mar 10 09:46:03 crc kubenswrapper[4883]: I0310 09:46:03.956281 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.097140 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pjq9\" (UniqueName: \"kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9\") pod \"3a46f17b-70fa-415b-a58a-05fabe683062\" (UID: \"3a46f17b-70fa-415b-a58a-05fabe683062\") " Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.103815 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9" (OuterVolumeSpecName: "kube-api-access-6pjq9") pod "3a46f17b-70fa-415b-a58a-05fabe683062" (UID: "3a46f17b-70fa-415b-a58a-05fabe683062"). InnerVolumeSpecName "kube-api-access-6pjq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.201060 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pjq9\" (UniqueName: \"kubernetes.io/projected/3a46f17b-70fa-415b-a58a-05fabe683062-kube-api-access-6pjq9\") on node \"crc\" DevicePath \"\"" Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.654105 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552266-685zg" event={"ID":"3a46f17b-70fa-415b-a58a-05fabe683062","Type":"ContainerDied","Data":"d9b18f5de6dd8fcb31bd49f075439310a45028a1667415604446e36a280624de"} Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.654165 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9b18f5de6dd8fcb31bd49f075439310a45028a1667415604446e36a280624de" Mar 10 09:46:04 crc kubenswrapper[4883]: I0310 09:46:04.654183 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552266-685zg" Mar 10 09:46:05 crc kubenswrapper[4883]: I0310 09:46:05.018748 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552260-m5d6m"] Mar 10 09:46:05 crc kubenswrapper[4883]: I0310 09:46:05.025172 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552260-m5d6m"] Mar 10 09:46:06 crc kubenswrapper[4883]: I0310 09:46:06.091861 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c22b88a-ce63-4c4a-a606-17c563e9e156" path="/var/lib/kubelet/pods/9c22b88a-ce63-4c4a-a606-17c563e9e156/volumes" Mar 10 09:46:06 crc kubenswrapper[4883]: I0310 09:46:06.233733 4883 scope.go:117] "RemoveContainer" containerID="43ec4b59a40b1bfb044dd60703dd00f3cac266b590c5664aa4f0403a553e3872" Mar 10 09:46:11 crc kubenswrapper[4883]: I0310 09:46:11.081305 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:46:11 crc kubenswrapper[4883]: E0310 09:46:11.082657 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:46:23 crc kubenswrapper[4883]: I0310 09:46:23.079388 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b" Mar 10 09:46:23 crc kubenswrapper[4883]: I0310 09:46:23.843294 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef"} Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.718321 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:09 crc kubenswrapper[4883]: E0310 09:47:09.719546 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a46f17b-70fa-415b-a58a-05fabe683062" containerName="oc" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.719562 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a46f17b-70fa-415b-a58a-05fabe683062" containerName="oc" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.719777 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a46f17b-70fa-415b-a58a-05fabe683062" containerName="oc" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.721347 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.739236 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.823673 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.823805 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content\") pod \"redhat-marketplace-hnmd7\" (UID: 
\"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.823887 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6dl7\" (UniqueName: \"kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.926102 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.926258 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6dl7\" (UniqueName: \"kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.926335 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.926935 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content\") pod \"redhat-marketplace-hnmd7\" (UID: 
\"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.926986 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:09 crc kubenswrapper[4883]: I0310 09:47:09.947827 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6dl7\" (UniqueName: \"kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7\") pod \"redhat-marketplace-hnmd7\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:10 crc kubenswrapper[4883]: I0310 09:47:10.037026 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:10 crc kubenswrapper[4883]: I0310 09:47:10.444863 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:11 crc kubenswrapper[4883]: I0310 09:47:11.254957 4883 generic.go:334] "Generic (PLEG): container finished" podID="c2cfe728-0af8-40ab-9378-8567163d6489" containerID="83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00" exitCode=0 Mar 10 09:47:11 crc kubenswrapper[4883]: I0310 09:47:11.255066 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerDied","Data":"83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00"} Mar 10 09:47:11 crc kubenswrapper[4883]: I0310 09:47:11.255372 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" 
event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerStarted","Data":"fb137214e3717144cbad13264c0b2115ce2602654d08f4640418db6656487038"} Mar 10 09:47:11 crc kubenswrapper[4883]: I0310 09:47:11.258271 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:47:12 crc kubenswrapper[4883]: I0310 09:47:12.265568 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerStarted","Data":"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875"} Mar 10 09:47:13 crc kubenswrapper[4883]: I0310 09:47:13.275050 4883 generic.go:334] "Generic (PLEG): container finished" podID="c2cfe728-0af8-40ab-9378-8567163d6489" containerID="e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875" exitCode=0 Mar 10 09:47:13 crc kubenswrapper[4883]: I0310 09:47:13.275101 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerDied","Data":"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875"} Mar 10 09:47:14 crc kubenswrapper[4883]: I0310 09:47:14.289228 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerStarted","Data":"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480"} Mar 10 09:47:14 crc kubenswrapper[4883]: I0310 09:47:14.306697 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hnmd7" podStartSLOduration=2.807112862 podStartE2EDuration="5.306675575s" podCreationTimestamp="2026-03-10 09:47:09 +0000 UTC" firstStartedPulling="2026-03-10 09:47:11.25790419 +0000 UTC m=+2617.512802078" lastFinishedPulling="2026-03-10 09:47:13.757466892 +0000 UTC 
m=+2620.012364791" observedRunningTime="2026-03-10 09:47:14.306584303 +0000 UTC m=+2620.561482192" watchObservedRunningTime="2026-03-10 09:47:14.306675575 +0000 UTC m=+2620.561573465" Mar 10 09:47:20 crc kubenswrapper[4883]: I0310 09:47:20.037837 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:20 crc kubenswrapper[4883]: I0310 09:47:20.038683 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:20 crc kubenswrapper[4883]: I0310 09:47:20.089129 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:20 crc kubenswrapper[4883]: I0310 09:47:20.391409 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:20 crc kubenswrapper[4883]: I0310 09:47:20.441489 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:22 crc kubenswrapper[4883]: I0310 09:47:22.371138 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hnmd7" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="registry-server" containerID="cri-o://9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480" gracePeriod=2 Mar 10 09:47:22 crc kubenswrapper[4883]: I0310 09:47:22.819785 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.022119 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities\") pod \"c2cfe728-0af8-40ab-9378-8567163d6489\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.022266 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6dl7\" (UniqueName: \"kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7\") pod \"c2cfe728-0af8-40ab-9378-8567163d6489\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.022314 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content\") pod \"c2cfe728-0af8-40ab-9378-8567163d6489\" (UID: \"c2cfe728-0af8-40ab-9378-8567163d6489\") " Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.022833 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities" (OuterVolumeSpecName: "utilities") pod "c2cfe728-0af8-40ab-9378-8567163d6489" (UID: "c2cfe728-0af8-40ab-9378-8567163d6489"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.029803 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7" (OuterVolumeSpecName: "kube-api-access-p6dl7") pod "c2cfe728-0af8-40ab-9378-8567163d6489" (UID: "c2cfe728-0af8-40ab-9378-8567163d6489"). InnerVolumeSpecName "kube-api-access-p6dl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.056079 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2cfe728-0af8-40ab-9378-8567163d6489" (UID: "c2cfe728-0af8-40ab-9378-8567163d6489"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.124737 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.124773 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cfe728-0af8-40ab-9378-8567163d6489-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.124787 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6dl7\" (UniqueName: \"kubernetes.io/projected/c2cfe728-0af8-40ab-9378-8567163d6489-kube-api-access-p6dl7\") on node \"crc\" DevicePath \"\"" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.384267 4883 generic.go:334] "Generic (PLEG): container finished" podID="c2cfe728-0af8-40ab-9378-8567163d6489" containerID="9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480" exitCode=0 Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.384449 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerDied","Data":"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480"} Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.384713 4883 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-hnmd7" event={"ID":"c2cfe728-0af8-40ab-9378-8567163d6489","Type":"ContainerDied","Data":"fb137214e3717144cbad13264c0b2115ce2602654d08f4640418db6656487038"} Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.384746 4883 scope.go:117] "RemoveContainer" containerID="9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.384604 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnmd7" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.405639 4883 scope.go:117] "RemoveContainer" containerID="e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.416607 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.421702 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnmd7"] Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.438370 4883 scope.go:117] "RemoveContainer" containerID="83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.456531 4883 scope.go:117] "RemoveContainer" containerID="9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480" Mar 10 09:47:23 crc kubenswrapper[4883]: E0310 09:47:23.457028 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480\": container with ID starting with 9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480 not found: ID does not exist" containerID="9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.457065 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480"} err="failed to get container status \"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480\": rpc error: code = NotFound desc = could not find container \"9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480\": container with ID starting with 9e170a5cd7b9b140c2d74f593d6a7a32e48945c0ad9d8fd8adb981a3777ad480 not found: ID does not exist" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.457093 4883 scope.go:117] "RemoveContainer" containerID="e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875" Mar 10 09:47:23 crc kubenswrapper[4883]: E0310 09:47:23.457385 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875\": container with ID starting with e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875 not found: ID does not exist" containerID="e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.457416 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875"} err="failed to get container status \"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875\": rpc error: code = NotFound desc = could not find container \"e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875\": container with ID starting with e0483017415b7096c677727b01183bfce1190486d7b6b3b20d2ccbb1943a9875 not found: ID does not exist" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.457439 4883 scope.go:117] "RemoveContainer" containerID="83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00" Mar 10 09:47:23 crc kubenswrapper[4883]: E0310 
09:47:23.457896 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00\": container with ID starting with 83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00 not found: ID does not exist" containerID="83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00" Mar 10 09:47:23 crc kubenswrapper[4883]: I0310 09:47:23.457932 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00"} err="failed to get container status \"83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00\": rpc error: code = NotFound desc = could not find container \"83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00\": container with ID starting with 83f17a59751194aa71c40d12ec9d3afb6c8f3ca78749c25d2c68f98eb6b9df00 not found: ID does not exist" Mar 10 09:47:24 crc kubenswrapper[4883]: I0310 09:47:24.093780 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" path="/var/lib/kubelet/pods/c2cfe728-0af8-40ab-9378-8567163d6489/volumes" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.140978 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552268-mwcmv"] Mar 10 09:48:00 crc kubenswrapper[4883]: E0310 09:48:00.142400 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="registry-server" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.142419 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="registry-server" Mar 10 09:48:00 crc kubenswrapper[4883]: E0310 09:48:00.142440 4883 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="extract-content" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.142447 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="extract-content" Mar 10 09:48:00 crc kubenswrapper[4883]: E0310 09:48:00.142493 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="extract-utilities" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.142500 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="extract-utilities" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.142747 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cfe728-0af8-40ab-9378-8567163d6489" containerName="registry-server" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.143527 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.148584 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.148628 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.148777 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.149785 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-mwcmv"] Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.267177 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8qm7\" (UniqueName: 
\"kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7\") pod \"auto-csr-approver-29552268-mwcmv\" (UID: \"4e44dc59-cae3-44ee-87bf-2b85d5850682\") " pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.368624 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8qm7\" (UniqueName: \"kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7\") pod \"auto-csr-approver-29552268-mwcmv\" (UID: \"4e44dc59-cae3-44ee-87bf-2b85d5850682\") " pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.386689 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8qm7\" (UniqueName: \"kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7\") pod \"auto-csr-approver-29552268-mwcmv\" (UID: \"4e44dc59-cae3-44ee-87bf-2b85d5850682\") " pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.459393 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:00 crc kubenswrapper[4883]: I0310 09:48:00.856686 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-mwcmv"] Mar 10 09:48:01 crc kubenswrapper[4883]: I0310 09:48:01.728901 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" event={"ID":"4e44dc59-cae3-44ee-87bf-2b85d5850682","Type":"ContainerStarted","Data":"6394610a49512c2011f25bcc020f3a3af9a16665fd86bd626d17107c1489da0a"} Mar 10 09:48:02 crc kubenswrapper[4883]: I0310 09:48:02.739920 4883 generic.go:334] "Generic (PLEG): container finished" podID="4e44dc59-cae3-44ee-87bf-2b85d5850682" containerID="cae180ac84c542adc5e640936e665d82183bbb27f65b7e1e59e92d435d368a52" exitCode=0 Mar 10 09:48:02 crc kubenswrapper[4883]: I0310 09:48:02.740026 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" event={"ID":"4e44dc59-cae3-44ee-87bf-2b85d5850682","Type":"ContainerDied","Data":"cae180ac84c542adc5e640936e665d82183bbb27f65b7e1e59e92d435d368a52"} Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.056920 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.251607 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8qm7\" (UniqueName: \"kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7\") pod \"4e44dc59-cae3-44ee-87bf-2b85d5850682\" (UID: \"4e44dc59-cae3-44ee-87bf-2b85d5850682\") " Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.257147 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7" (OuterVolumeSpecName: "kube-api-access-j8qm7") pod "4e44dc59-cae3-44ee-87bf-2b85d5850682" (UID: "4e44dc59-cae3-44ee-87bf-2b85d5850682"). InnerVolumeSpecName "kube-api-access-j8qm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.353993 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8qm7\" (UniqueName: \"kubernetes.io/projected/4e44dc59-cae3-44ee-87bf-2b85d5850682-kube-api-access-j8qm7\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.756861 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" event={"ID":"4e44dc59-cae3-44ee-87bf-2b85d5850682","Type":"ContainerDied","Data":"6394610a49512c2011f25bcc020f3a3af9a16665fd86bd626d17107c1489da0a"} Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.757199 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6394610a49512c2011f25bcc020f3a3af9a16665fd86bd626d17107c1489da0a" Mar 10 09:48:04 crc kubenswrapper[4883]: I0310 09:48:04.756915 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552268-mwcmv" Mar 10 09:48:05 crc kubenswrapper[4883]: I0310 09:48:05.125064 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552262-5shjs"] Mar 10 09:48:05 crc kubenswrapper[4883]: I0310 09:48:05.138385 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552262-5shjs"] Mar 10 09:48:06 crc kubenswrapper[4883]: I0310 09:48:06.088689 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a7f73ce-4bec-451b-8fc7-a787366b6001" path="/var/lib/kubelet/pods/5a7f73ce-4bec-451b-8fc7-a787366b6001/volumes" Mar 10 09:48:06 crc kubenswrapper[4883]: I0310 09:48:06.330246 4883 scope.go:117] "RemoveContainer" containerID="cbfa56661b259829721fddd0e6afd1c57e6fb0cd59f60d7658cb1739ef3cba81" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.289821 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:45 crc kubenswrapper[4883]: E0310 09:48:45.291050 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e44dc59-cae3-44ee-87bf-2b85d5850682" containerName="oc" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.291065 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e44dc59-cae3-44ee-87bf-2b85d5850682" containerName="oc" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.291283 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e44dc59-cae3-44ee-87bf-2b85d5850682" containerName="oc" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.292955 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.304027 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.345002 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.345118 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.345554 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlqkd\" (UniqueName: \"kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.448109 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlqkd\" (UniqueName: \"kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.448382 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.448453 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.449029 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.449098 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.468841 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlqkd\" (UniqueName: \"kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd\") pod \"community-operators-bjqqv\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:45 crc kubenswrapper[4883]: I0310 09:48:45.613070 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:46 crc kubenswrapper[4883]: I0310 09:48:46.127116 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:47 crc kubenswrapper[4883]: I0310 09:48:47.113626 4883 generic.go:334] "Generic (PLEG): container finished" podID="42973d9a-2054-4a79-b789-8dfba272a471" containerID="a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab" exitCode=0 Mar 10 09:48:47 crc kubenswrapper[4883]: I0310 09:48:47.113703 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerDied","Data":"a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab"} Mar 10 09:48:47 crc kubenswrapper[4883]: I0310 09:48:47.113974 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerStarted","Data":"c1efafaa06cf0a95cadd9a4584b4c6a8a5ceb29a673366e3e20b9973500195f0"} Mar 10 09:48:47 crc kubenswrapper[4883]: I0310 09:48:47.448978 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:48:47 crc kubenswrapper[4883]: I0310 09:48:47.449029 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:48:48 crc kubenswrapper[4883]: I0310 09:48:48.123289 4883 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerStarted","Data":"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644"} Mar 10 09:48:49 crc kubenswrapper[4883]: I0310 09:48:49.139361 4883 generic.go:334] "Generic (PLEG): container finished" podID="42973d9a-2054-4a79-b789-8dfba272a471" containerID="f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644" exitCode=0 Mar 10 09:48:49 crc kubenswrapper[4883]: I0310 09:48:49.139412 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerDied","Data":"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644"} Mar 10 09:48:50 crc kubenswrapper[4883]: I0310 09:48:50.151495 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerStarted","Data":"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35"} Mar 10 09:48:50 crc kubenswrapper[4883]: I0310 09:48:50.176455 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bjqqv" podStartSLOduration=2.666995884 podStartE2EDuration="5.176428908s" podCreationTimestamp="2026-03-10 09:48:45 +0000 UTC" firstStartedPulling="2026-03-10 09:48:47.116151045 +0000 UTC m=+2713.371048934" lastFinishedPulling="2026-03-10 09:48:49.62558407 +0000 UTC m=+2715.880481958" observedRunningTime="2026-03-10 09:48:50.170510889 +0000 UTC m=+2716.425408778" watchObservedRunningTime="2026-03-10 09:48:50.176428908 +0000 UTC m=+2716.431326797" Mar 10 09:48:55 crc kubenswrapper[4883]: I0310 09:48:55.613946 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:55 crc kubenswrapper[4883]: I0310 
09:48:55.614593 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:55 crc kubenswrapper[4883]: I0310 09:48:55.657358 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:56 crc kubenswrapper[4883]: I0310 09:48:56.256518 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:56 crc kubenswrapper[4883]: I0310 09:48:56.327662 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.229922 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bjqqv" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="registry-server" containerID="cri-o://1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35" gracePeriod=2 Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.662048 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.811890 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content\") pod \"42973d9a-2054-4a79-b789-8dfba272a471\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.812364 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlqkd\" (UniqueName: \"kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd\") pod \"42973d9a-2054-4a79-b789-8dfba272a471\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.812436 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities\") pod \"42973d9a-2054-4a79-b789-8dfba272a471\" (UID: \"42973d9a-2054-4a79-b789-8dfba272a471\") " Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.813615 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities" (OuterVolumeSpecName: "utilities") pod "42973d9a-2054-4a79-b789-8dfba272a471" (UID: "42973d9a-2054-4a79-b789-8dfba272a471"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.821166 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd" (OuterVolumeSpecName: "kube-api-access-zlqkd") pod "42973d9a-2054-4a79-b789-8dfba272a471" (UID: "42973d9a-2054-4a79-b789-8dfba272a471"). InnerVolumeSpecName "kube-api-access-zlqkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.856621 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42973d9a-2054-4a79-b789-8dfba272a471" (UID: "42973d9a-2054-4a79-b789-8dfba272a471"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.914918 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlqkd\" (UniqueName: \"kubernetes.io/projected/42973d9a-2054-4a79-b789-8dfba272a471-kube-api-access-zlqkd\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.914949 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:58 crc kubenswrapper[4883]: I0310 09:48:58.914960 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42973d9a-2054-4a79-b789-8dfba272a471-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.243580 4883 generic.go:334] "Generic (PLEG): container finished" podID="42973d9a-2054-4a79-b789-8dfba272a471" containerID="1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35" exitCode=0 Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.243662 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerDied","Data":"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35"} Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.243723 4883 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-bjqqv" event={"ID":"42973d9a-2054-4a79-b789-8dfba272a471","Type":"ContainerDied","Data":"c1efafaa06cf0a95cadd9a4584b4c6a8a5ceb29a673366e3e20b9973500195f0"} Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.243751 4883 scope.go:117] "RemoveContainer" containerID="1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.243745 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bjqqv" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.273243 4883 scope.go:117] "RemoveContainer" containerID="f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.273265 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.282113 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bjqqv"] Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.303098 4883 scope.go:117] "RemoveContainer" containerID="a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.323953 4883 scope.go:117] "RemoveContainer" containerID="1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35" Mar 10 09:48:59 crc kubenswrapper[4883]: E0310 09:48:59.324377 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35\": container with ID starting with 1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35 not found: ID does not exist" containerID="1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35" Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 
09:48:59.324429 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35"} err="failed to get container status \"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35\": rpc error: code = NotFound desc = could not find container \"1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35\": container with ID starting with 1de9518cb6e59010ae05b7268fff78682d1dd6503c9109d89d131fa2c3485f35 not found: ID does not exist"
Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.324463 4883 scope.go:117] "RemoveContainer" containerID="f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644"
Mar 10 09:48:59 crc kubenswrapper[4883]: E0310 09:48:59.324754 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644\": container with ID starting with f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644 not found: ID does not exist" containerID="f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644"
Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.324782 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644"} err="failed to get container status \"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644\": rpc error: code = NotFound desc = could not find container \"f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644\": container with ID starting with f49fe694c99905bb8d09628f2d233a51862ef1e801a652fb3892200510863644 not found: ID does not exist"
Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.324800 4883 scope.go:117] "RemoveContainer" containerID="a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab"
Mar 10 09:48:59 crc kubenswrapper[4883]: E0310 09:48:59.325011 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab\": container with ID starting with a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab not found: ID does not exist" containerID="a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab"
Mar 10 09:48:59 crc kubenswrapper[4883]: I0310 09:48:59.325034 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab"} err="failed to get container status \"a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab\": rpc error: code = NotFound desc = could not find container \"a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab\": container with ID starting with a61d98ad0299c497825c854409970281ae213ffbceb76c765cbe663f1fae2fab not found: ID does not exist"
Mar 10 09:49:00 crc kubenswrapper[4883]: I0310 09:49:00.090526 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42973d9a-2054-4a79-b789-8dfba272a471" path="/var/lib/kubelet/pods/42973d9a-2054-4a79-b789-8dfba272a471/volumes"
Mar 10 09:49:17 crc kubenswrapper[4883]: I0310 09:49:17.449043 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:49:17 crc kubenswrapper[4883]: I0310 09:49:17.449661 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.449559 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.450337 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.450413 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8"
Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.451415 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.451496 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef" gracePeriod=600
Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.675725 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef" exitCode=0
Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.675785 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef"}
Mar 10 09:49:47 crc kubenswrapper[4883]: I0310 09:49:47.676028 4883 scope.go:117] "RemoveContainer" containerID="cd289bea8d86501abc37d9bdb3d1a2064a8db57dcbb257893c1ec8f40885b22b"
Mar 10 09:49:48 crc kubenswrapper[4883]: I0310 09:49:48.688040 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"}
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.146834 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552270-wwj5c"]
Mar 10 09:50:00 crc kubenswrapper[4883]: E0310 09:50:00.148047 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="registry-server"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.148061 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="registry-server"
Mar 10 09:50:00 crc kubenswrapper[4883]: E0310 09:50:00.148087 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="extract-content"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.148094 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="extract-content"
Mar 10 09:50:00 crc kubenswrapper[4883]: E0310 09:50:00.148134 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="extract-utilities"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.148140 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="extract-utilities"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.148447 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="42973d9a-2054-4a79-b789-8dfba272a471" containerName="registry-server"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.149450 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-wwj5c"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.152148 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.152262 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.153327 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.158690 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-wwj5c"]
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.244387 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r5fg\" (UniqueName: \"kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg\") pod \"auto-csr-approver-29552270-wwj5c\" (UID: \"c376c647-e032-465a-8abc-e8ae35219822\") " pod="openshift-infra/auto-csr-approver-29552270-wwj5c"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.345888 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r5fg\" (UniqueName: \"kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg\") pod \"auto-csr-approver-29552270-wwj5c\" (UID: \"c376c647-e032-465a-8abc-e8ae35219822\") " pod="openshift-infra/auto-csr-approver-29552270-wwj5c"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.364598 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r5fg\" (UniqueName: \"kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg\") pod \"auto-csr-approver-29552270-wwj5c\" (UID: \"c376c647-e032-465a-8abc-e8ae35219822\") " pod="openshift-infra/auto-csr-approver-29552270-wwj5c"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.473152 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-wwj5c"
Mar 10 09:50:00 crc kubenswrapper[4883]: I0310 09:50:00.891209 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-wwj5c"]
Mar 10 09:50:01 crc kubenswrapper[4883]: I0310 09:50:01.823107 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552270-wwj5c" event={"ID":"c376c647-e032-465a-8abc-e8ae35219822","Type":"ContainerStarted","Data":"f82360dd0b039ebdd0bd386bf762895c7a1cce10ae7224ac6e29e92d41a28d69"}
Mar 10 09:50:02 crc kubenswrapper[4883]: I0310 09:50:02.835966 4883 generic.go:334] "Generic (PLEG): container finished" podID="c376c647-e032-465a-8abc-e8ae35219822" containerID="d6974f5cfcc3316424a08bec1cdff0b6d759b4edf301ad2eefc7e6b26b5cc6f9" exitCode=0
Mar 10 09:50:02 crc kubenswrapper[4883]: I0310 09:50:02.836081 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552270-wwj5c" event={"ID":"c376c647-e032-465a-8abc-e8ae35219822","Type":"ContainerDied","Data":"d6974f5cfcc3316424a08bec1cdff0b6d759b4edf301ad2eefc7e6b26b5cc6f9"}
Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.190872 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-wwj5c"
Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.227721 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r5fg\" (UniqueName: \"kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg\") pod \"c376c647-e032-465a-8abc-e8ae35219822\" (UID: \"c376c647-e032-465a-8abc-e8ae35219822\") "
Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.233184 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg" (OuterVolumeSpecName: "kube-api-access-6r5fg") pod "c376c647-e032-465a-8abc-e8ae35219822" (UID: "c376c647-e032-465a-8abc-e8ae35219822"). InnerVolumeSpecName "kube-api-access-6r5fg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.331337 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r5fg\" (UniqueName: \"kubernetes.io/projected/c376c647-e032-465a-8abc-e8ae35219822-kube-api-access-6r5fg\") on node \"crc\" DevicePath \"\""
Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.858291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552270-wwj5c" event={"ID":"c376c647-e032-465a-8abc-e8ae35219822","Type":"ContainerDied","Data":"f82360dd0b039ebdd0bd386bf762895c7a1cce10ae7224ac6e29e92d41a28d69"}
Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.858363 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552270-wwj5c"
Mar 10 09:50:04 crc kubenswrapper[4883]: I0310 09:50:04.858375 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f82360dd0b039ebdd0bd386bf762895c7a1cce10ae7224ac6e29e92d41a28d69"
Mar 10 09:50:05 crc kubenswrapper[4883]: I0310 09:50:05.258875 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552264-fxbzh"]
Mar 10 09:50:05 crc kubenswrapper[4883]: I0310 09:50:05.265675 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552264-fxbzh"]
Mar 10 09:50:06 crc kubenswrapper[4883]: I0310 09:50:06.088285 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7561a55-c8cc-4fad-99cf-6a81612efa5f" path="/var/lib/kubelet/pods/e7561a55-c8cc-4fad-99cf-6a81612efa5f/volumes"
Mar 10 09:50:06 crc kubenswrapper[4883]: I0310 09:50:06.440468 4883 scope.go:117] "RemoveContainer" containerID="7a20c7b029586fbd175b90818495ae7de2932811c342c11864ee48f481c0032f"
Mar 10 09:51:47 crc kubenswrapper[4883]: I0310 09:51:47.449155 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 09:51:47 crc kubenswrapper[4883]: I0310 09:51:47.449839 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.597933 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"]
Mar 10 09:51:51 crc kubenswrapper[4883]: E0310 09:51:51.598916 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c376c647-e032-465a-8abc-e8ae35219822" containerName="oc"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.598932 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="c376c647-e032-465a-8abc-e8ae35219822" containerName="oc"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.599176 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="c376c647-e032-465a-8abc-e8ae35219822" containerName="oc"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.600605 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.609834 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv88p\" (UniqueName: \"kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.610051 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.610079 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.610858 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"]
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.712017 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv88p\" (UniqueName: \"kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.712183 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.712206 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.712767 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.712812 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.730002 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv88p\" (UniqueName: \"kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p\") pod \"redhat-operators-2zqxs\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") " pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:51:51 crc kubenswrapper[4883]: I0310 09:51:51.918884 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:51:52 crc kubenswrapper[4883]: I0310 09:51:52.328636 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"]
Mar 10 09:51:52 crc kubenswrapper[4883]: I0310 09:51:52.781259 4883 generic.go:334] "Generic (PLEG): container finished" podID="91051722-2538-461e-bacc-795d4c2dd312" containerID="762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481" exitCode=0
Mar 10 09:51:52 crc kubenswrapper[4883]: I0310 09:51:52.781307 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerDied","Data":"762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481"}
Mar 10 09:51:52 crc kubenswrapper[4883]: I0310 09:51:52.781334 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerStarted","Data":"1616cc638f92b5334f353a113714aba9530cd72972f5516666f36f97cb2cc4cc"}
Mar 10 09:51:53 crc kubenswrapper[4883]: I0310 09:51:53.795173 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerStarted","Data":"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"}
Mar 10 09:51:54 crc kubenswrapper[4883]: E0310 09:51:54.426077 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91051722_2538_461e_bacc_795d4c2dd312.slice/crio-21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1.scope\": RecentStats: unable to find data in memory cache]"
Mar 10 09:51:56 crc kubenswrapper[4883]: I0310 09:51:56.821104 4883 generic.go:334] "Generic (PLEG): container finished" podID="91051722-2538-461e-bacc-795d4c2dd312" containerID="21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1" exitCode=0
Mar 10 09:51:56 crc kubenswrapper[4883]: I0310 09:51:56.821154 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerDied","Data":"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"}
Mar 10 09:51:57 crc kubenswrapper[4883]: I0310 09:51:57.835996 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerStarted","Data":"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"}
Mar 10 09:51:57 crc kubenswrapper[4883]: I0310 09:51:57.854203 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2zqxs" podStartSLOduration=2.352187888 podStartE2EDuration="6.854162674s" podCreationTimestamp="2026-03-10 09:51:51 +0000 UTC" firstStartedPulling="2026-03-10 09:51:52.783558405 +0000 UTC m=+2899.038456294" lastFinishedPulling="2026-03-10 09:51:57.285533191 +0000 UTC m=+2903.540431080" observedRunningTime="2026-03-10 09:51:57.852099894 +0000 UTC m=+2904.106997783" watchObservedRunningTime="2026-03-10 09:51:57.854162674 +0000 UTC m=+2904.109060563"
Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.144414 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552272-7b8fg"]
Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.146548 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-7b8fg"
Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.150033 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.150057 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-7b8fg"]
Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.150757 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.151506 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.186676 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngtqv\" (UniqueName: \"kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv\") pod \"auto-csr-approver-29552272-7b8fg\" (UID: \"4fd9f896-b725-4a44-825a-9fd728da26b2\") " pod="openshift-infra/auto-csr-approver-29552272-7b8fg"
Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.288319 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngtqv\" (UniqueName: \"kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv\") pod \"auto-csr-approver-29552272-7b8fg\" (UID: \"4fd9f896-b725-4a44-825a-9fd728da26b2\") " pod="openshift-infra/auto-csr-approver-29552272-7b8fg"
Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.308534 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngtqv\" (UniqueName: \"kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv\") pod \"auto-csr-approver-29552272-7b8fg\" (UID: \"4fd9f896-b725-4a44-825a-9fd728da26b2\") " pod="openshift-infra/auto-csr-approver-29552272-7b8fg"
Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.467418 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-7b8fg"
Mar 10 09:52:00 crc kubenswrapper[4883]: I0310 09:52:00.916242 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-7b8fg"]
Mar 10 09:52:00 crc kubenswrapper[4883]: W0310 09:52:00.917138 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fd9f896_b725_4a44_825a_9fd728da26b2.slice/crio-2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403 WatchSource:0}: Error finding container 2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403: Status 404 returned error can't find the container with id 2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403
Mar 10 09:52:01 crc kubenswrapper[4883]: I0310 09:52:01.886912 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552272-7b8fg" event={"ID":"4fd9f896-b725-4a44-825a-9fd728da26b2","Type":"ContainerStarted","Data":"2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403"}
Mar 10 09:52:01 crc kubenswrapper[4883]: I0310 09:52:01.919768 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:52:01 crc kubenswrapper[4883]: I0310 09:52:01.920492 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:52:02 crc kubenswrapper[4883]: I0310 09:52:02.897387 4883 generic.go:334] "Generic (PLEG): container finished" podID="4fd9f896-b725-4a44-825a-9fd728da26b2" containerID="55da525eb21d992e868ae1c36ae9269aafbac403e9bea6b7d9b244d9b58e489c" exitCode=0
Mar 10 09:52:02 crc kubenswrapper[4883]: I0310 09:52:02.897563 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552272-7b8fg" event={"ID":"4fd9f896-b725-4a44-825a-9fd728da26b2","Type":"ContainerDied","Data":"55da525eb21d992e868ae1c36ae9269aafbac403e9bea6b7d9b244d9b58e489c"}
Mar 10 09:52:02 crc kubenswrapper[4883]: I0310 09:52:02.957256 4883 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2zqxs" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="registry-server" probeResult="failure" output=<
Mar 10 09:52:02 crc kubenswrapper[4883]: timeout: failed to connect service ":50051" within 1s
Mar 10 09:52:02 crc kubenswrapper[4883]: >
Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.201218 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-7b8fg"
Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.275953 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngtqv\" (UniqueName: \"kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv\") pod \"4fd9f896-b725-4a44-825a-9fd728da26b2\" (UID: \"4fd9f896-b725-4a44-825a-9fd728da26b2\") "
Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.280685 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv" (OuterVolumeSpecName: "kube-api-access-ngtqv") pod "4fd9f896-b725-4a44-825a-9fd728da26b2" (UID: "4fd9f896-b725-4a44-825a-9fd728da26b2"). InnerVolumeSpecName "kube-api-access-ngtqv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.379609 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngtqv\" (UniqueName: \"kubernetes.io/projected/4fd9f896-b725-4a44-825a-9fd728da26b2-kube-api-access-ngtqv\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.919487 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552272-7b8fg" event={"ID":"4fd9f896-b725-4a44-825a-9fd728da26b2","Type":"ContainerDied","Data":"2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403"}
Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.919545 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4d960dfde81a150d70c8a5478d637f7a81810b04283e80c86ca52c8b791403"
Mar 10 09:52:04 crc kubenswrapper[4883]: I0310 09:52:04.919542 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552272-7b8fg"
Mar 10 09:52:05 crc kubenswrapper[4883]: I0310 09:52:05.268202 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-685zg"]
Mar 10 09:52:05 crc kubenswrapper[4883]: I0310 09:52:05.277205 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552266-685zg"]
Mar 10 09:52:06 crc kubenswrapper[4883]: I0310 09:52:06.090305 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a46f17b-70fa-415b-a58a-05fabe683062" path="/var/lib/kubelet/pods/3a46f17b-70fa-415b-a58a-05fabe683062/volumes"
Mar 10 09:52:06 crc kubenswrapper[4883]: I0310 09:52:06.537182 4883 scope.go:117] "RemoveContainer" containerID="8f1431de5f428e41dddc47217c2e968809dd1f4b2b6ca77bcaf70fa3ca340a9d"
Mar 10 09:52:11 crc kubenswrapper[4883]: I0310 09:52:11.959921 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:52:11 crc kubenswrapper[4883]: I0310 09:52:11.999324 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:52:12 crc kubenswrapper[4883]: I0310 09:52:12.192848 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"]
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.001738 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2zqxs" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="registry-server" containerID="cri-o://0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49" gracePeriod=2
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.424519 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.527932 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv88p\" (UniqueName: \"kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p\") pod \"91051722-2538-461e-bacc-795d4c2dd312\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") "
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.528139 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content\") pod \"91051722-2538-461e-bacc-795d4c2dd312\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") "
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.528195 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities\") pod \"91051722-2538-461e-bacc-795d4c2dd312\" (UID: \"91051722-2538-461e-bacc-795d4c2dd312\") "
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.529301 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities" (OuterVolumeSpecName: "utilities") pod "91051722-2538-461e-bacc-795d4c2dd312" (UID: "91051722-2538-461e-bacc-795d4c2dd312"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.534067 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p" (OuterVolumeSpecName: "kube-api-access-vv88p") pod "91051722-2538-461e-bacc-795d4c2dd312" (UID: "91051722-2538-461e-bacc-795d4c2dd312"). InnerVolumeSpecName "kube-api-access-vv88p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.624374 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91051722-2538-461e-bacc-795d4c2dd312" (UID: "91051722-2538-461e-bacc-795d4c2dd312"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.631870 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.631909 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91051722-2538-461e-bacc-795d4c2dd312-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:13 crc kubenswrapper[4883]: I0310 09:52:13.631927 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv88p\" (UniqueName: \"kubernetes.io/projected/91051722-2538-461e-bacc-795d4c2dd312-kube-api-access-vv88p\") on node \"crc\" DevicePath \"\""
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.014153 4883 generic.go:334] "Generic (PLEG): container finished" podID="91051722-2538-461e-bacc-795d4c2dd312" containerID="0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49" exitCode=0
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.014213 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerDied","Data":"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"}
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.014248 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zqxs"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.014268 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zqxs" event={"ID":"91051722-2538-461e-bacc-795d4c2dd312","Type":"ContainerDied","Data":"1616cc638f92b5334f353a113714aba9530cd72972f5516666f36f97cb2cc4cc"}
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.014291 4883 scope.go:117] "RemoveContainer" containerID="0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.051009 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"]
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.051606 4883 scope.go:117] "RemoveContainer" containerID="21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.060679 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2zqxs"]
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.074569 4883 scope.go:117] "RemoveContainer" containerID="762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.091726 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91051722-2538-461e-bacc-795d4c2dd312" path="/var/lib/kubelet/pods/91051722-2538-461e-bacc-795d4c2dd312/volumes"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.105618 4883 scope.go:117] "RemoveContainer" containerID="0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"
Mar 10 09:52:14 crc kubenswrapper[4883]: E0310 09:52:14.106360 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49\": container with ID starting with 0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49 not found: ID does not exist" containerID="0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.106402 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49"} err="failed to get container status \"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49\": rpc error: code = NotFound desc = could not find container \"0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49\": container with ID starting with 0230e8654a16cf5d2de59d7617579cc80838d0b67943f4692a41e441c2904f49 not found: ID does not exist"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.106441 4883 scope.go:117] "RemoveContainer" containerID="21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"
Mar 10 09:52:14 crc kubenswrapper[4883]: E0310 09:52:14.107066 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1\": container with ID starting with 21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1 not found: ID does not exist" containerID="21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"
Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.107112 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1"} err="failed to get container status \"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1\": rpc error: code = NotFound desc = could not find container \"21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1\": container with ID starting with 21b1a3cbea3bd9f7a592918da9acdd3863a64a8e3319c033eb5ab93e33b147e1 not found: ID does not
exist" Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.107159 4883 scope.go:117] "RemoveContainer" containerID="762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481" Mar 10 09:52:14 crc kubenswrapper[4883]: E0310 09:52:14.107678 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481\": container with ID starting with 762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481 not found: ID does not exist" containerID="762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481" Mar 10 09:52:14 crc kubenswrapper[4883]: I0310 09:52:14.107707 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481"} err="failed to get container status \"762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481\": rpc error: code = NotFound desc = could not find container \"762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481\": container with ID starting with 762590f870edc9bb529bf1385e6d34544ccf9274431ad4affe0e2c6a77b13481 not found: ID does not exist" Mar 10 09:52:17 crc kubenswrapper[4883]: I0310 09:52:17.449160 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:52:17 crc kubenswrapper[4883]: I0310 09:52:17.449545 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 
09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.852541 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"] Mar 10 09:52:32 crc kubenswrapper[4883]: E0310 09:52:32.853644 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="registry-server" Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.853662 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="registry-server" Mar 10 09:52:32 crc kubenswrapper[4883]: E0310 09:52:32.853694 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd9f896-b725-4a44-825a-9fd728da26b2" containerName="oc" Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.853700 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd9f896-b725-4a44-825a-9fd728da26b2" containerName="oc" Mar 10 09:52:32 crc kubenswrapper[4883]: E0310 09:52:32.853743 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="extract-content" Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.853749 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="extract-content" Mar 10 09:52:32 crc kubenswrapper[4883]: E0310 09:52:32.853760 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="extract-utilities" Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.853766 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="extract-utilities" Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.853990 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="91051722-2538-461e-bacc-795d4c2dd312" containerName="registry-server" Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.854004 4883 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd9f896-b725-4a44-825a-9fd728da26b2" containerName="oc" Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.855462 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.860123 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"] Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.974080 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.974544 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kknn8\" (UniqueName: \"kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:32 crc kubenswrapper[4883]: I0310 09:52:32.974695 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.077410 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kknn8\" (UniqueName: 
\"kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.077528 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.077661 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.078210 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.078285 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.097780 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kknn8\" (UniqueName: 
\"kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8\") pod \"certified-operators-tlmsb\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.173927 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:33 crc kubenswrapper[4883]: I0310 09:52:33.609528 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"] Mar 10 09:52:34 crc kubenswrapper[4883]: I0310 09:52:34.187706 4883 generic.go:334] "Generic (PLEG): container finished" podID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerID="b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f" exitCode=0 Mar 10 09:52:34 crc kubenswrapper[4883]: I0310 09:52:34.187923 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerDied","Data":"b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f"} Mar 10 09:52:34 crc kubenswrapper[4883]: I0310 09:52:34.188737 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerStarted","Data":"2d644c394c26845e44dd788d0cb0b57d0709245acf92637ed359eb376459943a"} Mar 10 09:52:34 crc kubenswrapper[4883]: I0310 09:52:34.190015 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:52:35 crc kubenswrapper[4883]: I0310 09:52:35.198927 4883 generic.go:334] "Generic (PLEG): container finished" podID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerID="ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3" exitCode=0 Mar 10 09:52:35 crc kubenswrapper[4883]: I0310 09:52:35.199016 4883 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerDied","Data":"ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3"} Mar 10 09:52:35 crc kubenswrapper[4883]: E0310 09:52:35.274882 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2beac4_fd7d_47e1_89c3_27c1490ee6b1.slice/crio-conmon-ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:52:36 crc kubenswrapper[4883]: I0310 09:52:36.209111 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerStarted","Data":"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1"} Mar 10 09:52:36 crc kubenswrapper[4883]: I0310 09:52:36.232698 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tlmsb" podStartSLOduration=2.678836044 podStartE2EDuration="4.232679198s" podCreationTimestamp="2026-03-10 09:52:32 +0000 UTC" firstStartedPulling="2026-03-10 09:52:34.189695843 +0000 UTC m=+2940.444593742" lastFinishedPulling="2026-03-10 09:52:35.743539007 +0000 UTC m=+2941.998436896" observedRunningTime="2026-03-10 09:52:36.224790248 +0000 UTC m=+2942.479688138" watchObservedRunningTime="2026-03-10 09:52:36.232679198 +0000 UTC m=+2942.487577087" Mar 10 09:52:43 crc kubenswrapper[4883]: I0310 09:52:43.174403 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:43 crc kubenswrapper[4883]: I0310 09:52:43.174811 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tlmsb" Mar 
10 09:52:43 crc kubenswrapper[4883]: I0310 09:52:43.211334 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:43 crc kubenswrapper[4883]: I0310 09:52:43.303832 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:43 crc kubenswrapper[4883]: I0310 09:52:43.443458 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"] Mar 10 09:52:45 crc kubenswrapper[4883]: I0310 09:52:45.286697 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tlmsb" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="registry-server" containerID="cri-o://d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1" gracePeriod=2 Mar 10 09:52:45 crc kubenswrapper[4883]: E0310 09:52:45.475161 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2beac4_fd7d_47e1_89c3_27c1490ee6b1.slice/crio-d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.289346 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.294693 4883 generic.go:334] "Generic (PLEG): container finished" podID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerID="d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1" exitCode=0 Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.294737 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerDied","Data":"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1"} Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.294762 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tlmsb" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.294782 4883 scope.go:117] "RemoveContainer" containerID="d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.294771 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlmsb" event={"ID":"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1","Type":"ContainerDied","Data":"2d644c394c26845e44dd788d0cb0b57d0709245acf92637ed359eb376459943a"} Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.313035 4883 scope.go:117] "RemoveContainer" containerID="ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.330871 4883 scope.go:117] "RemoveContainer" containerID="b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.369572 4883 scope.go:117] "RemoveContainer" containerID="d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1" Mar 10 09:52:46 crc kubenswrapper[4883]: E0310 09:52:46.371979 4883 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1\": container with ID starting with d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1 not found: ID does not exist" containerID="d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.372030 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1"} err="failed to get container status \"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1\": rpc error: code = NotFound desc = could not find container \"d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1\": container with ID starting with d520a4be3e07838a04f2d3057e9b99e6b0c79d6e8d66ff3502984ca64fc05ca1 not found: ID does not exist" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.372064 4883 scope.go:117] "RemoveContainer" containerID="ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3" Mar 10 09:52:46 crc kubenswrapper[4883]: E0310 09:52:46.372813 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3\": container with ID starting with ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3 not found: ID does not exist" containerID="ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.372854 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3"} err="failed to get container status \"ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3\": rpc error: code = NotFound desc = could not find container 
\"ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3\": container with ID starting with ebf3dae86ebc76d8833f2472cbd58717ee7c50ddfa5c562fb757b5b3672569f3 not found: ID does not exist" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.372900 4883 scope.go:117] "RemoveContainer" containerID="b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f" Mar 10 09:52:46 crc kubenswrapper[4883]: E0310 09:52:46.373305 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f\": container with ID starting with b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f not found: ID does not exist" containerID="b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.373329 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f"} err="failed to get container status \"b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f\": rpc error: code = NotFound desc = could not find container \"b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f\": container with ID starting with b6b336e16862c561f1c1258508df7157126d93a784424aae456095a70cedfb4f not found: ID does not exist" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.430512 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kknn8\" (UniqueName: \"kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8\") pod \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.430580 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content\") pod \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.430743 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities\") pod \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\" (UID: \"bf2beac4-fd7d-47e1-89c3-27c1490ee6b1\") " Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.431614 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities" (OuterVolumeSpecName: "utilities") pod "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" (UID: "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.436041 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8" (OuterVolumeSpecName: "kube-api-access-kknn8") pod "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" (UID: "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1"). InnerVolumeSpecName "kube-api-access-kknn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.474384 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" (UID: "bf2beac4-fd7d-47e1-89c3-27c1490ee6b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.533899 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.533930 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kknn8\" (UniqueName: \"kubernetes.io/projected/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-kube-api-access-kknn8\") on node \"crc\" DevicePath \"\"" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.533943 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.623801 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"] Mar 10 09:52:46 crc kubenswrapper[4883]: I0310 09:52:46.630213 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tlmsb"] Mar 10 09:52:47 crc kubenswrapper[4883]: I0310 09:52:47.448995 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:52:47 crc kubenswrapper[4883]: I0310 09:52:47.449295 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:52:47 crc kubenswrapper[4883]: 
I0310 09:52:47.449376 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 09:52:47 crc kubenswrapper[4883]: I0310 09:52:47.449992 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:52:47 crc kubenswrapper[4883]: I0310 09:52:47.450060 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" gracePeriod=600 Mar 10 09:52:47 crc kubenswrapper[4883]: E0310 09:52:47.571359 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:52:48 crc kubenswrapper[4883]: I0310 09:52:48.090803 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" path="/var/lib/kubelet/pods/bf2beac4-fd7d-47e1-89c3-27c1490ee6b1/volumes" Mar 10 09:52:48 crc kubenswrapper[4883]: I0310 09:52:48.313216 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" exitCode=0 Mar 10 
09:52:48 crc kubenswrapper[4883]: I0310 09:52:48.313304 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f"} Mar 10 09:52:48 crc kubenswrapper[4883]: I0310 09:52:48.313464 4883 scope.go:117] "RemoveContainer" containerID="42db6b75ced608253f11bc996fd1ec66cf10171b52105327ab8dabc2d2a5fbef" Mar 10 09:52:48 crc kubenswrapper[4883]: I0310 09:52:48.314359 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:52:48 crc kubenswrapper[4883]: E0310 09:52:48.314679 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:52:59 crc kubenswrapper[4883]: I0310 09:52:59.080795 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:52:59 crc kubenswrapper[4883]: E0310 09:52:59.081765 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:53:11 crc kubenswrapper[4883]: I0310 09:53:11.080241 4883 scope.go:117] "RemoveContainer" 
containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:53:11 crc kubenswrapper[4883]: E0310 09:53:11.081341 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:53:23 crc kubenswrapper[4883]: I0310 09:53:23.080385 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:53:23 crc kubenswrapper[4883]: E0310 09:53:23.081343 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:53:37 crc kubenswrapper[4883]: I0310 09:53:37.079681 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:53:37 crc kubenswrapper[4883]: E0310 09:53:37.080612 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:53:52 crc kubenswrapper[4883]: I0310 09:53:52.081693 4883 scope.go:117] 
"RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:53:52 crc kubenswrapper[4883]: E0310 09:53:52.083003 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.141972 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552274-p9wph"] Mar 10 09:54:00 crc kubenswrapper[4883]: E0310 09:54:00.142953 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="extract-content" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.142965 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="extract-content" Mar 10 09:54:00 crc kubenswrapper[4883]: E0310 09:54:00.142980 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="registry-server" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.142988 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="registry-server" Mar 10 09:54:00 crc kubenswrapper[4883]: E0310 09:54:00.142995 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="extract-utilities" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.143001 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="extract-utilities" Mar 10 09:54:00 crc kubenswrapper[4883]: 
I0310 09:54:00.143178 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf2beac4-fd7d-47e1-89c3-27c1490ee6b1" containerName="registry-server" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.143864 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-p9wph" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.147006 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-p9wph"] Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.147291 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.147346 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.147409 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.165797 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmm9\" (UniqueName: \"kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9\") pod \"auto-csr-approver-29552274-p9wph\" (UID: \"05d7064d-bd5b-4775-ab8f-2d5780f76440\") " pod="openshift-infra/auto-csr-approver-29552274-p9wph" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.266967 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmm9\" (UniqueName: \"kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9\") pod \"auto-csr-approver-29552274-p9wph\" (UID: \"05d7064d-bd5b-4775-ab8f-2d5780f76440\") " pod="openshift-infra/auto-csr-approver-29552274-p9wph" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.284373 4883 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmm9\" (UniqueName: \"kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9\") pod \"auto-csr-approver-29552274-p9wph\" (UID: \"05d7064d-bd5b-4775-ab8f-2d5780f76440\") " pod="openshift-infra/auto-csr-approver-29552274-p9wph" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.468698 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-p9wph" Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.867644 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-p9wph"] Mar 10 09:54:00 crc kubenswrapper[4883]: I0310 09:54:00.934028 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552274-p9wph" event={"ID":"05d7064d-bd5b-4775-ab8f-2d5780f76440","Type":"ContainerStarted","Data":"a99d8f0ac49315b672c2118f1d6221fe71f4e07f66807a02b130f45f33fa72fe"} Mar 10 09:54:02 crc kubenswrapper[4883]: I0310 09:54:02.969016 4883 generic.go:334] "Generic (PLEG): container finished" podID="05d7064d-bd5b-4775-ab8f-2d5780f76440" containerID="37b94d707e9a0d88465d35e2d3c44d0202d4cba279ab4acfb2218748019ab99d" exitCode=0 Mar 10 09:54:02 crc kubenswrapper[4883]: I0310 09:54:02.969217 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552274-p9wph" event={"ID":"05d7064d-bd5b-4775-ab8f-2d5780f76440","Type":"ContainerDied","Data":"37b94d707e9a0d88465d35e2d3c44d0202d4cba279ab4acfb2218748019ab99d"} Mar 10 09:54:04 crc kubenswrapper[4883]: I0310 09:54:04.323315 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-p9wph" Mar 10 09:54:04 crc kubenswrapper[4883]: I0310 09:54:04.358275 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gmm9\" (UniqueName: \"kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9\") pod \"05d7064d-bd5b-4775-ab8f-2d5780f76440\" (UID: \"05d7064d-bd5b-4775-ab8f-2d5780f76440\") " Mar 10 09:54:04 crc kubenswrapper[4883]: I0310 09:54:04.365492 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9" (OuterVolumeSpecName: "kube-api-access-7gmm9") pod "05d7064d-bd5b-4775-ab8f-2d5780f76440" (UID: "05d7064d-bd5b-4775-ab8f-2d5780f76440"). InnerVolumeSpecName "kube-api-access-7gmm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:54:04 crc kubenswrapper[4883]: I0310 09:54:04.461184 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gmm9\" (UniqueName: \"kubernetes.io/projected/05d7064d-bd5b-4775-ab8f-2d5780f76440-kube-api-access-7gmm9\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.002153 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552274-p9wph" event={"ID":"05d7064d-bd5b-4775-ab8f-2d5780f76440","Type":"ContainerDied","Data":"a99d8f0ac49315b672c2118f1d6221fe71f4e07f66807a02b130f45f33fa72fe"} Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.002200 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552274-p9wph" Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.002219 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a99d8f0ac49315b672c2118f1d6221fe71f4e07f66807a02b130f45f33fa72fe" Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.080281 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:54:05 crc kubenswrapper[4883]: E0310 09:54:05.080596 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.395151 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-mwcmv"] Mar 10 09:54:05 crc kubenswrapper[4883]: I0310 09:54:05.400262 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552268-mwcmv"] Mar 10 09:54:06 crc kubenswrapper[4883]: I0310 09:54:06.092545 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e44dc59-cae3-44ee-87bf-2b85d5850682" path="/var/lib/kubelet/pods/4e44dc59-cae3-44ee-87bf-2b85d5850682/volumes" Mar 10 09:54:06 crc kubenswrapper[4883]: I0310 09:54:06.643492 4883 scope.go:117] "RemoveContainer" containerID="cae180ac84c542adc5e640936e665d82183bbb27f65b7e1e59e92d435d368a52" Mar 10 09:54:18 crc kubenswrapper[4883]: I0310 09:54:18.080639 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:54:18 crc kubenswrapper[4883]: E0310 09:54:18.081461 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:54:31 crc kubenswrapper[4883]: I0310 09:54:31.079729 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:54:31 crc kubenswrapper[4883]: E0310 09:54:31.080699 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:54:45 crc kubenswrapper[4883]: I0310 09:54:45.080343 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:54:45 crc kubenswrapper[4883]: E0310 09:54:45.081267 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:54:49 crc kubenswrapper[4883]: I0310 09:54:49.354297 4883 generic.go:334] "Generic (PLEG): container finished" podID="d483d791-15b3-49e7-8095-5660a9d0fdaa" 
containerID="20faf1bc2dd52b1aabee2636feb1570644b5e51b82c37399b21f107d33a5382f" exitCode=0 Mar 10 09:54:49 crc kubenswrapper[4883]: I0310 09:54:49.354389 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d483d791-15b3-49e7-8095-5660a9d0fdaa","Type":"ContainerDied","Data":"20faf1bc2dd52b1aabee2636feb1570644b5e51b82c37399b21f107d33a5382f"} Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.651063 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799655 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799720 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799755 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799785 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 
09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799816 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799836 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799892 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799935 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rczsq\" (UniqueName: \"kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.799971 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data\") pod \"d483d791-15b3-49e7-8095-5660a9d0fdaa\" (UID: \"d483d791-15b3-49e7-8095-5660a9d0fdaa\") " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.800383 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.800760 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data" (OuterVolumeSpecName: "config-data") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.805982 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.806444 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.811512 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq" (OuterVolumeSpecName: "kube-api-access-rczsq") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "kube-api-access-rczsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.831408 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.844347 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.845195 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.846989 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d483d791-15b3-49e7-8095-5660a9d0fdaa" (UID: "d483d791-15b3-49e7-8095-5660a9d0fdaa"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902849 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rczsq\" (UniqueName: \"kubernetes.io/projected/d483d791-15b3-49e7-8095-5660a9d0fdaa-kube-api-access-rczsq\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902882 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902893 4883 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902903 4883 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902934 4883 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902943 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902951 4883 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902960 4883 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d483d791-15b3-49e7-8095-5660a9d0fdaa-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.902968 4883 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d483d791-15b3-49e7-8095-5660a9d0fdaa-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:50 crc kubenswrapper[4883]: I0310 09:54:50.917437 4883 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 10 09:54:51 crc kubenswrapper[4883]: I0310 09:54:51.005287 4883 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:54:51 crc kubenswrapper[4883]: I0310 09:54:51.374210 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d483d791-15b3-49e7-8095-5660a9d0fdaa","Type":"ContainerDied","Data":"5e9e0098f227f9af35dc0b77276abeb28187e6a5424e5047b3daacd6cc5a8286"} Mar 10 09:54:51 crc kubenswrapper[4883]: I0310 09:54:51.374253 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e9e0098f227f9af35dc0b77276abeb28187e6a5424e5047b3daacd6cc5a8286" Mar 10 09:54:51 crc 
kubenswrapper[4883]: I0310 09:54:51.374516 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:54:59 crc kubenswrapper[4883]: I0310 09:54:59.080060 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:54:59 crc kubenswrapper[4883]: E0310 09:54:59.081163 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.247939 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 09:55:00 crc kubenswrapper[4883]: E0310 09:55:00.248386 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d483d791-15b3-49e7-8095-5660a9d0fdaa" containerName="tempest-tests-tempest-tests-runner" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.248406 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d483d791-15b3-49e7-8095-5660a9d0fdaa" containerName="tempest-tests-tempest-tests-runner" Mar 10 09:55:00 crc kubenswrapper[4883]: E0310 09:55:00.248448 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05d7064d-bd5b-4775-ab8f-2d5780f76440" containerName="oc" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.248454 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="05d7064d-bd5b-4775-ab8f-2d5780f76440" containerName="oc" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.248703 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="05d7064d-bd5b-4775-ab8f-2d5780f76440" 
containerName="oc" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.248724 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d483d791-15b3-49e7-8095-5660a9d0fdaa" containerName="tempest-tests-tempest-tests-runner" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.249441 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.251852 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fm4md" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.254950 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.362734 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.362855 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n85x2\" (UniqueName: \"kubernetes.io/projected/4d76dec9-afd2-4850-aacb-c8d60819fc1e-kube-api-access-n85x2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.464893 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.465120 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n85x2\" (UniqueName: \"kubernetes.io/projected/4d76dec9-afd2-4850-aacb-c8d60819fc1e-kube-api-access-n85x2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.465302 4883 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.482189 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n85x2\" (UniqueName: \"kubernetes.io/projected/4d76dec9-afd2-4850-aacb-c8d60819fc1e-kube-api-access-n85x2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.488279 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4d76dec9-afd2-4850-aacb-c8d60819fc1e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.566881 4883 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:55:00 crc kubenswrapper[4883]: I0310 09:55:00.962119 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 09:55:01 crc kubenswrapper[4883]: I0310 09:55:01.457538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4d76dec9-afd2-4850-aacb-c8d60819fc1e","Type":"ContainerStarted","Data":"fb012b78fec124e385c7eee81b71f76f1eab5053ff907510140e23e830544a46"} Mar 10 09:55:02 crc kubenswrapper[4883]: I0310 09:55:02.475902 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4d76dec9-afd2-4850-aacb-c8d60819fc1e","Type":"ContainerStarted","Data":"9f3c8731c0248aa3b4b8e965278c9cc8b6078c54b8d2341e9ffa46e5dc4eafd6"} Mar 10 09:55:02 crc kubenswrapper[4883]: I0310 09:55:02.496590 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.5190385979999999 podStartE2EDuration="2.496574936s" podCreationTimestamp="2026-03-10 09:55:00 +0000 UTC" firstStartedPulling="2026-03-10 09:55:00.970066023 +0000 UTC m=+3087.224963912" lastFinishedPulling="2026-03-10 09:55:01.947602361 +0000 UTC m=+3088.202500250" observedRunningTime="2026-03-10 09:55:02.489401685 +0000 UTC m=+3088.744299575" watchObservedRunningTime="2026-03-10 09:55:02.496574936 +0000 UTC m=+3088.751472825" Mar 10 09:55:10 crc kubenswrapper[4883]: I0310 09:55:10.080831 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:55:10 crc kubenswrapper[4883]: E0310 09:55:10.081825 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.294866 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9vgtk/must-gather-bkkgz"] Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.297050 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.301744 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9vgtk"/"openshift-service-ca.crt" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.301967 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9vgtk"/"kube-root-ca.crt" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.322625 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9vgtk/must-gather-bkkgz"] Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.438627 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output\") pod \"must-gather-bkkgz\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.439201 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm952\" (UniqueName: \"kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952\") pod \"must-gather-bkkgz\" (UID: 
\"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.540391 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm952\" (UniqueName: \"kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952\") pod \"must-gather-bkkgz\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.540531 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output\") pod \"must-gather-bkkgz\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.540986 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output\") pod \"must-gather-bkkgz\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.563218 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm952\" (UniqueName: \"kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952\") pod \"must-gather-bkkgz\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") " pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:20 crc kubenswrapper[4883]: I0310 09:55:20.614254 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" Mar 10 09:55:21 crc kubenswrapper[4883]: I0310 09:55:21.077821 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9vgtk/must-gather-bkkgz"] Mar 10 09:55:21 crc kubenswrapper[4883]: I0310 09:55:21.684063 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" event={"ID":"e940d297-b038-48e9-a4bd-777df629de28","Type":"ContainerStarted","Data":"6595242d952bf7e089a911d59e9a0a5d5af6e08fb65750a6f303235b58a953f9"} Mar 10 09:55:23 crc kubenswrapper[4883]: I0310 09:55:23.080248 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:55:23 crc kubenswrapper[4883]: E0310 09:55:23.080887 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:55:27 crc kubenswrapper[4883]: I0310 09:55:27.736616 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" event={"ID":"e940d297-b038-48e9-a4bd-777df629de28","Type":"ContainerStarted","Data":"765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615"} Mar 10 09:55:27 crc kubenswrapper[4883]: I0310 09:55:27.736961 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" event={"ID":"e940d297-b038-48e9-a4bd-777df629de28","Type":"ContainerStarted","Data":"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"} Mar 10 09:55:27 crc kubenswrapper[4883]: I0310 09:55:27.759412 4883 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" podStartSLOduration=2.262921114 podStartE2EDuration="7.759391325s" podCreationTimestamp="2026-03-10 09:55:20 +0000 UTC" firstStartedPulling="2026-03-10 09:55:21.088402145 +0000 UTC m=+3107.343300034" lastFinishedPulling="2026-03-10 09:55:26.584872356 +0000 UTC m=+3112.839770245" observedRunningTime="2026-03-10 09:55:27.756230096 +0000 UTC m=+3114.011127985" watchObservedRunningTime="2026-03-10 09:55:27.759391325 +0000 UTC m=+3114.014289214" Mar 10 09:55:29 crc kubenswrapper[4883]: I0310 09:55:29.890361 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-rz69b"] Mar 10 09:55:29 crc kubenswrapper[4883]: I0310 09:55:29.891912 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:29 crc kubenswrapper[4883]: I0310 09:55:29.893772 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9vgtk"/"default-dockercfg-7b2hr" Mar 10 09:55:29 crc kubenswrapper[4883]: I0310 09:55:29.981189 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trvfb\" (UniqueName: \"kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:29 crc kubenswrapper[4883]: I0310 09:55:29.981516 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.082282 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-trvfb\" (UniqueName: \"kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.082341 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.082532 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.101464 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trvfb\" (UniqueName: \"kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb\") pod \"crc-debug-rz69b\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.209578 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:55:30 crc kubenswrapper[4883]: W0310 09:55:30.241563 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a396c68_b7c1_4eda_abc2_563cdd15fee3.slice/crio-67677ee8b72e7307a6d63c64fb27dbf478ad12ab018dcf5dc306b6b5c263a6ce WatchSource:0}: Error finding container 67677ee8b72e7307a6d63c64fb27dbf478ad12ab018dcf5dc306b6b5c263a6ce: Status 404 returned error can't find the container with id 67677ee8b72e7307a6d63c64fb27dbf478ad12ab018dcf5dc306b6b5c263a6ce Mar 10 09:55:30 crc kubenswrapper[4883]: I0310 09:55:30.765222 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" event={"ID":"7a396c68-b7c1-4eda-abc2-563cdd15fee3","Type":"ContainerStarted","Data":"67677ee8b72e7307a6d63c64fb27dbf478ad12ab018dcf5dc306b6b5c263a6ce"} Mar 10 09:55:38 crc kubenswrapper[4883]: I0310 09:55:38.079817 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:55:38 crc kubenswrapper[4883]: E0310 09:55:38.080605 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:55:40 crc kubenswrapper[4883]: I0310 09:55:40.863751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" event={"ID":"7a396c68-b7c1-4eda-abc2-563cdd15fee3","Type":"ContainerStarted","Data":"3d6e4b449f9ccbd94f8a72789781de51d066c6a7bae2debb32621e755120d6e1"} Mar 10 09:55:40 crc kubenswrapper[4883]: I0310 09:55:40.883804 4883 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" podStartSLOduration=1.914598512 podStartE2EDuration="11.883784442s" podCreationTimestamp="2026-03-10 09:55:29 +0000 UTC" firstStartedPulling="2026-03-10 09:55:30.244077838 +0000 UTC m=+3116.498975727" lastFinishedPulling="2026-03-10 09:55:40.213263778 +0000 UTC m=+3126.468161657" observedRunningTime="2026-03-10 09:55:40.877090584 +0000 UTC m=+3127.131988473" watchObservedRunningTime="2026-03-10 09:55:40.883784442 +0000 UTC m=+3127.138682330" Mar 10 09:55:52 crc kubenswrapper[4883]: I0310 09:55:52.080209 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:55:52 crc kubenswrapper[4883]: E0310 09:55:52.081362 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.148622 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552276-gr77g"] Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.151055 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.153698 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.162910 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.164868 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.164909 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-gr77g"] Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.215594 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46zdq\" (UniqueName: \"kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq\") pod \"auto-csr-approver-29552276-gr77g\" (UID: \"4c430570-1b7a-4e38-9a8b-f13d69c18882\") " pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.318328 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46zdq\" (UniqueName: \"kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq\") pod \"auto-csr-approver-29552276-gr77g\" (UID: \"4c430570-1b7a-4e38-9a8b-f13d69c18882\") " pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.345291 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46zdq\" (UniqueName: \"kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq\") pod \"auto-csr-approver-29552276-gr77g\" (UID: \"4c430570-1b7a-4e38-9a8b-f13d69c18882\") " 
pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.472489 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:00 crc kubenswrapper[4883]: I0310 09:56:00.888983 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-gr77g"] Mar 10 09:56:01 crc kubenswrapper[4883]: I0310 09:56:01.061979 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552276-gr77g" event={"ID":"4c430570-1b7a-4e38-9a8b-f13d69c18882","Type":"ContainerStarted","Data":"3e6dfca9351496f6c9565e98895c334390a1acdd73ab5b64f2c561f681606d78"} Mar 10 09:56:03 crc kubenswrapper[4883]: I0310 09:56:03.081335 4883 generic.go:334] "Generic (PLEG): container finished" podID="4c430570-1b7a-4e38-9a8b-f13d69c18882" containerID="637bb0552a72f7592feca76119ba2d1ac02ce406f7badd427582618ad5b1a1db" exitCode=0 Mar 10 09:56:03 crc kubenswrapper[4883]: I0310 09:56:03.081463 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552276-gr77g" event={"ID":"4c430570-1b7a-4e38-9a8b-f13d69c18882","Type":"ContainerDied","Data":"637bb0552a72f7592feca76119ba2d1ac02ce406f7badd427582618ad5b1a1db"} Mar 10 09:56:04 crc kubenswrapper[4883]: I0310 09:56:04.392404 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:04 crc kubenswrapper[4883]: I0310 09:56:04.406081 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46zdq\" (UniqueName: \"kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq\") pod \"4c430570-1b7a-4e38-9a8b-f13d69c18882\" (UID: \"4c430570-1b7a-4e38-9a8b-f13d69c18882\") " Mar 10 09:56:04 crc kubenswrapper[4883]: I0310 09:56:04.412882 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq" (OuterVolumeSpecName: "kube-api-access-46zdq") pod "4c430570-1b7a-4e38-9a8b-f13d69c18882" (UID: "4c430570-1b7a-4e38-9a8b-f13d69c18882"). InnerVolumeSpecName "kube-api-access-46zdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:56:04 crc kubenswrapper[4883]: I0310 09:56:04.510369 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46zdq\" (UniqueName: \"kubernetes.io/projected/4c430570-1b7a-4e38-9a8b-f13d69c18882-kube-api-access-46zdq\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.080136 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:56:05 crc kubenswrapper[4883]: E0310 09:56:05.080452 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.097376 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552276-gr77g" event={"ID":"4c430570-1b7a-4e38-9a8b-f13d69c18882","Type":"ContainerDied","Data":"3e6dfca9351496f6c9565e98895c334390a1acdd73ab5b64f2c561f681606d78"} Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.097411 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552276-gr77g" Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.097418 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e6dfca9351496f6c9565e98895c334390a1acdd73ab5b64f2c561f681606d78" Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.454544 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-wwj5c"] Mar 10 09:56:05 crc kubenswrapper[4883]: I0310 09:56:05.464178 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552270-wwj5c"] Mar 10 09:56:06 crc kubenswrapper[4883]: I0310 09:56:06.092405 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c376c647-e032-465a-8abc-e8ae35219822" path="/var/lib/kubelet/pods/c376c647-e032-465a-8abc-e8ae35219822/volumes" Mar 10 09:56:06 crc kubenswrapper[4883]: I0310 09:56:06.717124 4883 scope.go:117] "RemoveContainer" containerID="d6974f5cfcc3316424a08bec1cdff0b6d759b4edf301ad2eefc7e6b26b5cc6f9" Mar 10 09:56:10 crc kubenswrapper[4883]: I0310 09:56:10.141887 4883 generic.go:334] "Generic (PLEG): container finished" podID="7a396c68-b7c1-4eda-abc2-563cdd15fee3" containerID="3d6e4b449f9ccbd94f8a72789781de51d066c6a7bae2debb32621e755120d6e1" exitCode=0 Mar 10 09:56:10 crc kubenswrapper[4883]: I0310 09:56:10.141979 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" event={"ID":"7a396c68-b7c1-4eda-abc2-563cdd15fee3","Type":"ContainerDied","Data":"3d6e4b449f9ccbd94f8a72789781de51d066c6a7bae2debb32621e755120d6e1"} Mar 10 09:56:11 crc 
kubenswrapper[4883]: I0310 09:56:11.234744 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.246130 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trvfb\" (UniqueName: \"kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb\") pod \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.246212 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host\") pod \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\" (UID: \"7a396c68-b7c1-4eda-abc2-563cdd15fee3\") " Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.246641 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host" (OuterVolumeSpecName: "host") pod "7a396c68-b7c1-4eda-abc2-563cdd15fee3" (UID: "7a396c68-b7c1-4eda-abc2-563cdd15fee3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.251165 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb" (OuterVolumeSpecName: "kube-api-access-trvfb") pod "7a396c68-b7c1-4eda-abc2-563cdd15fee3" (UID: "7a396c68-b7c1-4eda-abc2-563cdd15fee3"). InnerVolumeSpecName "kube-api-access-trvfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.273656 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-rz69b"] Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.279674 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-rz69b"] Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.348360 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trvfb\" (UniqueName: \"kubernetes.io/projected/7a396c68-b7c1-4eda-abc2-563cdd15fee3-kube-api-access-trvfb\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:11 crc kubenswrapper[4883]: I0310 09:56:11.348394 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a396c68-b7c1-4eda-abc2-563cdd15fee3-host\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.089146 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a396c68-b7c1-4eda-abc2-563cdd15fee3" path="/var/lib/kubelet/pods/7a396c68-b7c1-4eda-abc2-563cdd15fee3/volumes" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.162588 4883 scope.go:117] "RemoveContainer" containerID="3d6e4b449f9ccbd94f8a72789781de51d066c6a7bae2debb32621e755120d6e1" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.162733 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-rz69b" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.614669 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-f8p8k"] Mar 10 09:56:12 crc kubenswrapper[4883]: E0310 09:56:12.615829 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c430570-1b7a-4e38-9a8b-f13d69c18882" containerName="oc" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.615851 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c430570-1b7a-4e38-9a8b-f13d69c18882" containerName="oc" Mar 10 09:56:12 crc kubenswrapper[4883]: E0310 09:56:12.615894 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a396c68-b7c1-4eda-abc2-563cdd15fee3" containerName="container-00" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.615901 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a396c68-b7c1-4eda-abc2-563cdd15fee3" containerName="container-00" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.616133 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c430570-1b7a-4e38-9a8b-f13d69c18882" containerName="oc" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.616159 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a396c68-b7c1-4eda-abc2-563cdd15fee3" containerName="container-00" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.617016 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.618888 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9vgtk"/"default-dockercfg-7b2hr" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.674026 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqgnx\" (UniqueName: \"kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.674113 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.776657 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqgnx\" (UniqueName: \"kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.776868 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.777023 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.793524 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqgnx\" (UniqueName: \"kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx\") pod \"crc-debug-f8p8k\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: I0310 09:56:12.935726 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:12 crc kubenswrapper[4883]: W0310 09:56:12.963262 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8ba7892_8a88_47fc_8a18_2be6dfb4b6ff.slice/crio-56df39cb2589ae984de8d53617c01a58a1c40539c5707b2a5d02a8a6270c572d WatchSource:0}: Error finding container 56df39cb2589ae984de8d53617c01a58a1c40539c5707b2a5d02a8a6270c572d: Status 404 returned error can't find the container with id 56df39cb2589ae984de8d53617c01a58a1c40539c5707b2a5d02a8a6270c572d Mar 10 09:56:13 crc kubenswrapper[4883]: I0310 09:56:13.178498 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" event={"ID":"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff","Type":"ContainerStarted","Data":"532b2527d72aafd115f99ee9c5bf109ede9886f8114ce58f7fee9187e356c305"} Mar 10 09:56:13 crc kubenswrapper[4883]: I0310 09:56:13.178861 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" event={"ID":"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff","Type":"ContainerStarted","Data":"56df39cb2589ae984de8d53617c01a58a1c40539c5707b2a5d02a8a6270c572d"} Mar 10 09:56:13 crc kubenswrapper[4883]: I0310 
09:56:13.674719 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-f8p8k"] Mar 10 09:56:13 crc kubenswrapper[4883]: I0310 09:56:13.684866 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-f8p8k"] Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.191292 4883 generic.go:334] "Generic (PLEG): container finished" podID="d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" containerID="532b2527d72aafd115f99ee9c5bf109ede9886f8114ce58f7fee9187e356c305" exitCode=0 Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.270756 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.313056 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqgnx\" (UniqueName: \"kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx\") pod \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.313178 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host\") pod \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\" (UID: \"d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff\") " Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.313243 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host" (OuterVolumeSpecName: "host") pod "d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" (UID: "d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.313900 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-host\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.319631 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx" (OuterVolumeSpecName: "kube-api-access-tqgnx") pod "d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" (UID: "d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff"). InnerVolumeSpecName "kube-api-access-tqgnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.414528 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqgnx\" (UniqueName: \"kubernetes.io/projected/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff-kube-api-access-tqgnx\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.846419 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-x2sbs"] Mar 10 09:56:14 crc kubenswrapper[4883]: E0310 09:56:14.846802 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" containerName="container-00" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.846817 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" containerName="container-00" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.847040 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" containerName="container-00" Mar 10 09:56:14 crc kubenswrapper[4883]: I0310 09:56:14.847716 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.026313 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx8wh\" (UniqueName: \"kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.026401 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.128147 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.128261 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.129065 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx8wh\" (UniqueName: \"kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc 
kubenswrapper[4883]: I0310 09:56:15.144088 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx8wh\" (UniqueName: \"kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh\") pod \"crc-debug-x2sbs\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.161546 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:15 crc kubenswrapper[4883]: W0310 09:56:15.185566 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb30bf7c_358d_4f5b_a5e0_efdb0a9bc4b0.slice/crio-c2b98c39dccc737efc6ca2be10fbfc53714a5c1f02982eef175dcd5d09bbb73e WatchSource:0}: Error finding container c2b98c39dccc737efc6ca2be10fbfc53714a5c1f02982eef175dcd5d09bbb73e: Status 404 returned error can't find the container with id c2b98c39dccc737efc6ca2be10fbfc53714a5c1f02982eef175dcd5d09bbb73e Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.198551 4883 scope.go:117] "RemoveContainer" containerID="532b2527d72aafd115f99ee9c5bf109ede9886f8114ce58f7fee9187e356c305" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.198667 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-f8p8k" Mar 10 09:56:15 crc kubenswrapper[4883]: I0310 09:56:15.202402 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" event={"ID":"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0","Type":"ContainerStarted","Data":"c2b98c39dccc737efc6ca2be10fbfc53714a5c1f02982eef175dcd5d09bbb73e"} Mar 10 09:56:16 crc kubenswrapper[4883]: I0310 09:56:16.090798 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff" path="/var/lib/kubelet/pods/d8ba7892-8a88-47fc-8a18-2be6dfb4b6ff/volumes" Mar 10 09:56:16 crc kubenswrapper[4883]: I0310 09:56:16.214603 4883 generic.go:334] "Generic (PLEG): container finished" podID="db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" containerID="d219caba54af1ce5a0a3fb8cf87ed0f4b6c5ef17917a57a19ad0eb0e451f6dfd" exitCode=0 Mar 10 09:56:16 crc kubenswrapper[4883]: I0310 09:56:16.214648 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" event={"ID":"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0","Type":"ContainerDied","Data":"d219caba54af1ce5a0a3fb8cf87ed0f4b6c5ef17917a57a19ad0eb0e451f6dfd"} Mar 10 09:56:16 crc kubenswrapper[4883]: I0310 09:56:16.247341 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-x2sbs"] Mar 10 09:56:16 crc kubenswrapper[4883]: I0310 09:56:16.255889 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9vgtk/crc-debug-x2sbs"] Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.305174 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.475867 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host\") pod \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.475935 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx8wh\" (UniqueName: \"kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh\") pod \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\" (UID: \"db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0\") " Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.476042 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host" (OuterVolumeSpecName: "host") pod "db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" (UID: "db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.476991 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-host\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.482905 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh" (OuterVolumeSpecName: "kube-api-access-mx8wh") pod "db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" (UID: "db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0"). InnerVolumeSpecName "kube-api-access-mx8wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:56:17 crc kubenswrapper[4883]: I0310 09:56:17.579798 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx8wh\" (UniqueName: \"kubernetes.io/projected/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0-kube-api-access-mx8wh\") on node \"crc\" DevicePath \"\"" Mar 10 09:56:18 crc kubenswrapper[4883]: I0310 09:56:18.089125 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" path="/var/lib/kubelet/pods/db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0/volumes" Mar 10 09:56:18 crc kubenswrapper[4883]: I0310 09:56:18.230230 4883 scope.go:117] "RemoveContainer" containerID="d219caba54af1ce5a0a3fb8cf87ed0f4b6c5ef17917a57a19ad0eb0e451f6dfd" Mar 10 09:56:18 crc kubenswrapper[4883]: I0310 09:56:18.230584 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/crc-debug-x2sbs" Mar 10 09:56:19 crc kubenswrapper[4883]: I0310 09:56:19.080274 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:56:19 crc kubenswrapper[4883]: E0310 09:56:19.080613 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:56:33 crc kubenswrapper[4883]: I0310 09:56:33.080331 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:56:33 crc kubenswrapper[4883]: E0310 09:56:33.081032 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.056705 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b55764b68-l794s_17a46674-c6ec-4128-8285-d71c228d11c8/barbican-api/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.286354 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f87cd7fb6-jz6ch_dce7df3b-5f31-4732-8d27-8e06dc07824d/barbican-keystone-listener/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.369183 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b55764b68-l794s_17a46674-c6ec-4128-8285-d71c228d11c8/barbican-api-log/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.469353 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f87cd7fb6-jz6ch_dce7df3b-5f31-4732-8d27-8e06dc07824d/barbican-keystone-listener-log/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.632016 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d48955d69-pvbn8_221490bc-406a-436f-8705-66106ed6bbe0/barbican-worker-log/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.633895 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d48955d69-pvbn8_221490bc-406a-436f-8705-66106ed6bbe0/barbican-worker/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.781545 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r_de8c98db-31db-4ecd-83f2-c53d4bdd2ddd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.857263 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/ceilometer-central-agent/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.896495 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/ceilometer-notification-agent/0.log" Mar 10 09:56:34 crc kubenswrapper[4883]: I0310 09:56:34.951203 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/proxy-httpd/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.059099 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/sg-core/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.132959 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2caba5f6-d05e-437e-868c-952e8adf3278/cinder-api/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.149589 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2caba5f6-d05e-437e-868c-952e8adf3278/cinder-api-log/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.309357 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a7bae0a1-9bb8-47ba-a161-764cd7406992/probe/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.319239 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a7bae0a1-9bb8-47ba-a161-764cd7406992/cinder-scheduler/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.442162 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm_07ddb6af-f2c7-46eb-aac4-fe69996caf27/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.584335 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh_269dd9c8-3d75-4892-9f75-c4fe1b9093b8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.638220 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/init/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.837932 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/init/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.858617 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr_2428d4e5-b48e-45ad-9bfb-711c3b1e8471/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:35 crc kubenswrapper[4883]: I0310 09:56:35.901116 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/dnsmasq-dns/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.051340 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4c354fe8-851f-4cf4-bc13-e06dba0a1cc0/glance-httpd/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.054590 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4c354fe8-851f-4cf4-bc13-e06dba0a1cc0/glance-log/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.236059 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_676458e7-e4a0-4f1a-b200-0ab75faaddb4/glance-httpd/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.255814 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_676458e7-e4a0-4f1a-b200-0ab75faaddb4/glance-log/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.387872 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c96ddb-t7tcm_ef0598ad-c7ea-4645-b553-7d9028397156/horizon/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.529786 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9_7e9f7531-37e1-4284-94ac-cada3d2fc301/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.653454 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c96ddb-t7tcm_ef0598ad-c7ea-4645-b553-7d9028397156/horizon-log/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.716160 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kglh5_361b2613-f26e-45c3-aabe-9a0f115e8e10/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:36 crc kubenswrapper[4883]: I0310 09:56:36.946668 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_39c373dd-952a-4305-82ed-1d047c7a859f/kube-state-metrics/0.log" Mar 10 09:56:37 crc kubenswrapper[4883]: I0310 09:56:37.005376 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-744f4576f6-kglt9_c6effa97-6f88-4706-98bc-b51af01bd993/keystone-api/0.log" Mar 10 09:56:37 crc kubenswrapper[4883]: I0310 09:56:37.173261 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-thhsw_eb3b72a2-945a-4719-87c0-ffaf7eb84b52/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:37 crc kubenswrapper[4883]: I0310 09:56:37.414065 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7b5fb6fc5c-pj985_82fb8a17-1c35-415a-8a5d-478730286eb1/neutron-httpd/0.log" Mar 10 09:56:37 crc kubenswrapper[4883]: I0310 09:56:37.709002 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7b5fb6fc5c-pj985_82fb8a17-1c35-415a-8a5d-478730286eb1/neutron-api/0.log" Mar 10 09:56:37 crc kubenswrapper[4883]: I0310 09:56:37.797963 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4_d37d0afe-ad64-4616-b877-bd05deefd038/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.321169 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_14ae000e-33d5-4caa-8b61-dd1ab03b9978/nova-api-log/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.430453 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_19096ebe-3796-4e22-a477-45d3e635a80a/nova-cell0-conductor-conductor/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.460391 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_14ae000e-33d5-4caa-8b61-dd1ab03b9978/nova-api-api/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.608975 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_90b06d82-9f07-4c29-9bad-987d2c6d027c/nova-cell1-conductor-conductor/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.719732 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2c5d710c-62fb-4a8c-8a5c-ec6709017c75/nova-cell1-novncproxy-novncproxy/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.814348 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-47dxf_af134b73-8c24-4b9e-b15e-48ff4b83ecd4/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:38 crc kubenswrapper[4883]: I0310 09:56:38.986190 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0743bd84-b1d5-4634-9a7f-2c9daf2a5994/nova-metadata-log/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.189825 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_626b3115-ced1-45ea-8401-e2bd7e79a20c/nova-scheduler-scheduler/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.261851 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/mysql-bootstrap/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.470440 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/galera/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.527286 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/mysql-bootstrap/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.668307 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/mysql-bootstrap/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.870841 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/mysql-bootstrap/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.929711 4883 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/galera/0.log" Mar 10 09:56:39 crc kubenswrapper[4883]: I0310 09:56:39.960959 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0743bd84-b1d5-4634-9a7f-2c9daf2a5994/nova-metadata-metadata/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.082541 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_166b0c95-d44f-41e4-b27a-01e549dfb9d2/openstackclient/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.165239 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lb2z9_6691939e-adb0-420c-bf9e-f4a9b670c83b/ovn-controller/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.338368 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-b2z2p_570aed6d-03dc-4ad5-b0e1-c6efc4facabb/openstack-network-exporter/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.435534 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server-init/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.584035 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server-init/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.633315 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovs-vswitchd/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.663826 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.795413 4883 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7cqkz_bbcde384-73a5-48c3-a5fb-226d671707cb/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.854602 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b47099e9-f945-4873-a704-ee55b0f0ac46/openstack-network-exporter/0.log" Mar 10 09:56:40 crc kubenswrapper[4883]: I0310 09:56:40.883586 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b47099e9-f945-4873-a704-ee55b0f0ac46/ovn-northd/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.289987 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_be383ddb-b33d-4129-acf8-1ffbbc21b1d4/openstack-network-exporter/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.293113 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_be383ddb-b33d-4129-acf8-1ffbbc21b1d4/ovsdbserver-nb/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.389236 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_249a9bf5-ef0f-4209-855e-3fa422106519/openstack-network-exporter/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.480645 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_249a9bf5-ef0f-4209-855e-3fa422106519/ovsdbserver-sb/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.529452 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5946656968-5mzlm_309c3af5-db30-48b8-8118-471950b7312c/placement-api/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.687948 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5946656968-5mzlm_309c3af5-db30-48b8-8118-471950b7312c/placement-log/0.log" Mar 10 09:56:41 crc kubenswrapper[4883]: I0310 09:56:41.728385 
4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/setup-container/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.022468 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/rabbitmq/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.057703 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/setup-container/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.064658 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/setup-container/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.221700 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/setup-container/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.271506 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/rabbitmq/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.277289 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz_0efdf39d-2133-4aaf-9fec-2b50533d3cae/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.478859 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pf4n9_d3461a81-abbe-4c3e-88ca-42eff1eeb14e/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.530663 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7_4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.690606 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rlqjc_61bb4cc5-1d4f-4439-a00e-4b2e27d4802b/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.743806 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-v5v84_caa69332-97ab-4629-900f-1596af363ba4/ssh-known-hosts-edpm-deployment/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.965925 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6656f7cc-nv5pp_ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b/proxy-server/0.log" Mar 10 09:56:42 crc kubenswrapper[4883]: I0310 09:56:42.993825 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6656f7cc-nv5pp_ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b/proxy-httpd/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.109755 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-n4vhh_cbe93226-96c7-4854-abdc-4afe54ad7ad5/swift-ring-rebalance/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.201787 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-auditor/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.281275 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-reaper/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.348268 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-replicator/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.380489 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-server/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.431084 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-auditor/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.538895 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-updater/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.544018 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-server/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.565451 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-replicator/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.703466 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-auditor/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.749351 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-expirer/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.756137 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-replicator/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.776319 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-server/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.892178 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-updater/0.log" Mar 10 09:56:43 crc kubenswrapper[4883]: I0310 09:56:43.948584 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/rsync/0.log" Mar 10 09:56:44 crc kubenswrapper[4883]: I0310 09:56:44.009233 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/swift-recon-cron/0.log" Mar 10 09:56:44 crc kubenswrapper[4883]: I0310 09:56:44.171539 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-blk56_b083d3b3-edb7-4d2f-a7b7-f1275bd83fde/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:44 crc kubenswrapper[4883]: I0310 09:56:44.201394 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d483d791-15b3-49e7-8095-5660a9d0fdaa/tempest-tests-tempest-tests-runner/0.log" Mar 10 09:56:44 crc kubenswrapper[4883]: I0310 09:56:44.399948 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4d76dec9-afd2-4850-aacb-c8d60819fc1e/test-operator-logs-container/0.log" Mar 10 09:56:44 crc kubenswrapper[4883]: I0310 09:56:44.465176 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp_20e06399-dd26-4a60-a6b7-261cc4505a92/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 09:56:46 crc kubenswrapper[4883]: I0310 09:56:46.080261 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 
09:56:46 crc kubenswrapper[4883]: E0310 09:56:46.080697 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:56:53 crc kubenswrapper[4883]: I0310 09:56:53.274820 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_52bdcacc-ce19-418b-871c-35482038da29/memcached/0.log" Mar 10 09:57:01 crc kubenswrapper[4883]: I0310 09:57:01.080639 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:57:01 crc kubenswrapper[4883]: E0310 09:57:01.082860 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:57:07 crc kubenswrapper[4883]: I0310 09:57:07.875368 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.070097 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.075403 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.084575 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.213751 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.231913 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/extract/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.233727 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.575739 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-h2cxw_9a394c48-31ca-4e99-b210-45ae6f67faaa/manager/0.log" Mar 10 09:57:08 crc kubenswrapper[4883]: I0310 09:57:08.851228 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-w9dbp_63474f68-d09d-4822-b650-96a37aead592/manager/0.log" Mar 10 09:57:09 crc kubenswrapper[4883]: I0310 09:57:09.058347 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-mbxnn_bf027c79-6bdb-4cfb-8c31-d785b80e2231/manager/0.log" Mar 10 09:57:09 crc kubenswrapper[4883]: I0310 
09:57:09.298007 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-fvwbt_8a4cb5eb-0894-440e-8cfd-448651696a6f/manager/0.log" Mar 10 09:57:09 crc kubenswrapper[4883]: I0310 09:57:09.674180 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-nzdsk_09a04267-a914-4c55-add8-735a053038d3/manager/0.log" Mar 10 09:57:09 crc kubenswrapper[4883]: I0310 09:57:09.735817 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-txdwh_884f7bcb-08ef-49f3-912b-ca921e342615/manager/0.log" Mar 10 09:57:09 crc kubenswrapper[4883]: I0310 09:57:09.891302 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-v6p2d_c994e4ad-140c-4655-ad69-e4013406d12e/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.029036 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-v5kxw_ad93994a-26d2-4353-80be-456c1311020e/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.120349 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-dgrlb_8b177c77-d85f-4374-b6db-a700719c1282/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.386881 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-kz9sv_ec624ec4-966f-410c-95c7-73be0f9cad27/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.543132 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-snvh5_91415f40-08a2-451b-abe8-38c7b447e66f/manager/0.log" Mar 10 09:57:10 crc 
kubenswrapper[4883]: I0310 09:57:10.677636 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-rpwdx_760c8dff-c64a-492b-a778-45ef16d197bd/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.785610 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-49gjk_d0e08342-2d1b-42d9-921e-1d948f701a58/manager/0.log" Mar 10 09:57:10 crc kubenswrapper[4883]: I0310 09:57:10.990899 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6647d7885f9f2px_2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f/manager/0.log" Mar 10 09:57:11 crc kubenswrapper[4883]: I0310 09:57:11.335716 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6cf8df7788-tzrb8_31e7ec33-4b44-48ce-9f01-e483a7668dd6/operator/0.log" Mar 10 09:57:11 crc kubenswrapper[4883]: I0310 09:57:11.452745 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-c4vjl_83852eec-509b-4074-b837-4f00d1d07d05/registry-server/0.log" Mar 10 09:57:11 crc kubenswrapper[4883]: I0310 09:57:11.573750 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-qnwgj_c13f33e2-dd6a-4ca0-91e7-5489c753e273/manager/0.log" Mar 10 09:57:11 crc kubenswrapper[4883]: I0310 09:57:11.912804 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-pppd9_04b3aecb-7cfd-4042-b003-4bc8c339aff8/manager/0.log" Mar 10 09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.030753 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pjjsn_475c1190-6d94-431a-943d-4e749ea87d6b/operator/0.log" Mar 10 
09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.180454 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-m6wph_1b429bd6-00de-4cc2-8a18-9f58897b6834/manager/0.log" Mar 10 09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.358902 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-mkjnt_3f4c2998-b51a-4620-b674-60bb0817eb7d/manager/0.log" Mar 10 09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.442781 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-8mpp4_d3d3c04d-7e05-4df2-85c6-394d0bde1a69/manager/0.log" Mar 10 09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.683820 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-rkjsw_a7216675-a296-4faa-9dd5-d857b15ffa3c/manager/0.log" Mar 10 09:57:12 crc kubenswrapper[4883]: I0310 09:57:12.755804 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6679ddfdc7-9ntl4_969b2d39-fb99-42df-8e6e-3ded5cd292c8/manager/0.log" Mar 10 09:57:14 crc kubenswrapper[4883]: I0310 09:57:14.307311 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-q52nj_ac18771f-5f45-40d8-b275-38e2e1c48ba6/manager/0.log" Mar 10 09:57:15 crc kubenswrapper[4883]: I0310 09:57:15.080375 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:57:15 crc kubenswrapper[4883]: E0310 09:57:15.080703 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.369924 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:23 crc kubenswrapper[4883]: E0310 09:57:23.370784 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" containerName="container-00" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.370800 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" containerName="container-00" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.371014 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="db30bf7c-358d-4f5b-a5e0-efdb0a9bc4b0" containerName="container-00" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.372309 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.379125 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.517070 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.517212 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnwms\" (UniqueName: \"kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.517496 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.619679 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.619767 4883 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.619818 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnwms\" (UniqueName: \"kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.620151 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.620535 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.637160 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnwms\" (UniqueName: \"kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms\") pod \"redhat-marketplace-49cnv\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:23 crc kubenswrapper[4883]: I0310 09:57:23.693172 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:24 crc kubenswrapper[4883]: I0310 09:57:24.138334 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:24 crc kubenswrapper[4883]: I0310 09:57:24.816759 4883 generic.go:334] "Generic (PLEG): container finished" podID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerID="1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a" exitCode=0 Mar 10 09:57:24 crc kubenswrapper[4883]: I0310 09:57:24.816924 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerDied","Data":"1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a"} Mar 10 09:57:24 crc kubenswrapper[4883]: I0310 09:57:24.817192 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerStarted","Data":"eef520defbaf457ad9a4386db296afc18afcda0e6337af57c3e015771d645861"} Mar 10 09:57:25 crc kubenswrapper[4883]: I0310 09:57:25.829147 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerStarted","Data":"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f"} Mar 10 09:57:26 crc kubenswrapper[4883]: I0310 09:57:26.837846 4883 generic.go:334] "Generic (PLEG): container finished" podID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerID="69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f" exitCode=0 Mar 10 09:57:26 crc kubenswrapper[4883]: I0310 09:57:26.838022 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" 
event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerDied","Data":"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f"} Mar 10 09:57:27 crc kubenswrapper[4883]: I0310 09:57:27.852244 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerStarted","Data":"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b"} Mar 10 09:57:27 crc kubenswrapper[4883]: I0310 09:57:27.871121 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-49cnv" podStartSLOduration=2.356227816 podStartE2EDuration="4.871103212s" podCreationTimestamp="2026-03-10 09:57:23 +0000 UTC" firstStartedPulling="2026-03-10 09:57:24.818812936 +0000 UTC m=+3231.073710825" lastFinishedPulling="2026-03-10 09:57:27.333688331 +0000 UTC m=+3233.588586221" observedRunningTime="2026-03-10 09:57:27.868138354 +0000 UTC m=+3234.123036242" watchObservedRunningTime="2026-03-10 09:57:27.871103212 +0000 UTC m=+3234.126001101" Mar 10 09:57:28 crc kubenswrapper[4883]: I0310 09:57:28.080342 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:57:28 crc kubenswrapper[4883]: E0310 09:57:28.080659 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:57:29 crc kubenswrapper[4883]: I0310 09:57:29.856154 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dlh8b_7ec510e9-f96b-44da-abec-7d49115d0c83/control-plane-machine-set-operator/0.log" Mar 10 09:57:29 crc kubenswrapper[4883]: I0310 09:57:29.990137 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7clc9_3de74a75-4aa1-46dd-ae5b-5c82b91811e5/kube-rbac-proxy/0.log" Mar 10 09:57:30 crc kubenswrapper[4883]: I0310 09:57:30.020679 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7clc9_3de74a75-4aa1-46dd-ae5b-5c82b91811e5/machine-api-operator/0.log" Mar 10 09:57:33 crc kubenswrapper[4883]: I0310 09:57:33.693668 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:33 crc kubenswrapper[4883]: I0310 09:57:33.695199 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:33 crc kubenswrapper[4883]: I0310 09:57:33.737592 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:33 crc kubenswrapper[4883]: I0310 09:57:33.937831 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:33 crc kubenswrapper[4883]: I0310 09:57:33.983720 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:35 crc kubenswrapper[4883]: I0310 09:57:35.917702 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-49cnv" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="registry-server" containerID="cri-o://0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b" gracePeriod=2 Mar 10 09:57:36 crc 
kubenswrapper[4883]: I0310 09:57:36.339460 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.390013 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities\") pod \"1aa44e40-18b7-44d8-9359-0b11eaa53417\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.390251 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content\") pod \"1aa44e40-18b7-44d8-9359-0b11eaa53417\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.390294 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnwms\" (UniqueName: \"kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms\") pod \"1aa44e40-18b7-44d8-9359-0b11eaa53417\" (UID: \"1aa44e40-18b7-44d8-9359-0b11eaa53417\") " Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.390834 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities" (OuterVolumeSpecName: "utilities") pod "1aa44e40-18b7-44d8-9359-0b11eaa53417" (UID: "1aa44e40-18b7-44d8-9359-0b11eaa53417"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.391092 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.395378 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms" (OuterVolumeSpecName: "kube-api-access-jnwms") pod "1aa44e40-18b7-44d8-9359-0b11eaa53417" (UID: "1aa44e40-18b7-44d8-9359-0b11eaa53417"). InnerVolumeSpecName "kube-api-access-jnwms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.407147 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1aa44e40-18b7-44d8-9359-0b11eaa53417" (UID: "1aa44e40-18b7-44d8-9359-0b11eaa53417"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.493000 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa44e40-18b7-44d8-9359-0b11eaa53417-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.493034 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnwms\" (UniqueName: \"kubernetes.io/projected/1aa44e40-18b7-44d8-9359-0b11eaa53417-kube-api-access-jnwms\") on node \"crc\" DevicePath \"\"" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.929459 4883 generic.go:334] "Generic (PLEG): container finished" podID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerID="0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b" exitCode=0 Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.929532 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerDied","Data":"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b"} Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.929584 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-49cnv" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.929611 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-49cnv" event={"ID":"1aa44e40-18b7-44d8-9359-0b11eaa53417","Type":"ContainerDied","Data":"eef520defbaf457ad9a4386db296afc18afcda0e6337af57c3e015771d645861"} Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.929639 4883 scope.go:117] "RemoveContainer" containerID="0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.951125 4883 scope.go:117] "RemoveContainer" containerID="69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f" Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.962635 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.968776 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-49cnv"] Mar 10 09:57:36 crc kubenswrapper[4883]: I0310 09:57:36.993424 4883 scope.go:117] "RemoveContainer" containerID="1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.015907 4883 scope.go:117] "RemoveContainer" containerID="0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b" Mar 10 09:57:37 crc kubenswrapper[4883]: E0310 09:57:37.016360 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b\": container with ID starting with 0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b not found: ID does not exist" containerID="0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.016412 4883 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b"} err="failed to get container status \"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b\": rpc error: code = NotFound desc = could not find container \"0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b\": container with ID starting with 0e51e2bef8baf360c7913fbce268d49e77916e7d3bfd490d076e2bfabab03c9b not found: ID does not exist" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.016460 4883 scope.go:117] "RemoveContainer" containerID="69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f" Mar 10 09:57:37 crc kubenswrapper[4883]: E0310 09:57:37.016848 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f\": container with ID starting with 69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f not found: ID does not exist" containerID="69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.016885 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f"} err="failed to get container status \"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f\": rpc error: code = NotFound desc = could not find container \"69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f\": container with ID starting with 69a64b6e7f60ca50df248d2c50dd54c26bb98a4e2c3140c0362c640166ad769f not found: ID does not exist" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.016917 4883 scope.go:117] "RemoveContainer" containerID="1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a" Mar 10 09:57:37 crc kubenswrapper[4883]: E0310 
09:57:37.017339 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a\": container with ID starting with 1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a not found: ID does not exist" containerID="1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a" Mar 10 09:57:37 crc kubenswrapper[4883]: I0310 09:57:37.017360 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a"} err="failed to get container status \"1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a\": rpc error: code = NotFound desc = could not find container \"1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a\": container with ID starting with 1aceb7c80f7ac23da9b31f65876a601f47b06ffa8604b4f2c7611ff53487718a not found: ID does not exist" Mar 10 09:57:38 crc kubenswrapper[4883]: I0310 09:57:38.093657 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" path="/var/lib/kubelet/pods/1aa44e40-18b7-44d8-9359-0b11eaa53417/volumes" Mar 10 09:57:39 crc kubenswrapper[4883]: I0310 09:57:39.079944 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:57:39 crc kubenswrapper[4883]: E0310 09:57:39.080243 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 09:57:41 crc kubenswrapper[4883]: I0310 09:57:41.190955 
4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-kl2rd_1c0c9250-e9df-4898-bd0e-91919353a3f6/cert-manager-controller/0.log" Mar 10 09:57:41 crc kubenswrapper[4883]: I0310 09:57:41.317734 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n2g9x_b92cb5d0-214a-49a6-b9b7-f210fef36956/cert-manager-cainjector/0.log" Mar 10 09:57:41 crc kubenswrapper[4883]: I0310 09:57:41.348747 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-dfhh4_f33cf1b9-ce0d-41f4-8f36-1b159badc41e/cert-manager-webhook/0.log" Mar 10 09:57:51 crc kubenswrapper[4883]: I0310 09:57:51.770593 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-mr8tf_805fc4e3-bab7-415e-a190-0ceeda5bd8b7/nmstate-console-plugin/0.log" Mar 10 09:57:51 crc kubenswrapper[4883]: I0310 09:57:51.938847 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5lcxd_d9c7e9ee-a0a0-4afe-bd00-872553ca9b32/nmstate-handler/0.log" Mar 10 09:57:51 crc kubenswrapper[4883]: I0310 09:57:51.991056 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-x5lcq_291985dd-d623-46ba-9e1b-056dc17d26ed/kube-rbac-proxy/0.log" Mar 10 09:57:52 crc kubenswrapper[4883]: I0310 09:57:52.029244 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-x5lcq_291985dd-d623-46ba-9e1b-056dc17d26ed/nmstate-metrics/0.log" Mar 10 09:57:52 crc kubenswrapper[4883]: I0310 09:57:52.080593 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 09:57:52 crc kubenswrapper[4883]: I0310 09:57:52.144791 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-k4v4s_a776287a-5b99-4f43-8d4c-191108392859/nmstate-operator/0.log" Mar 10 09:57:52 crc kubenswrapper[4883]: I0310 09:57:52.232845 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-ccbds_10ab1e00-47a1-4f9a-a55a-131935759d8d/nmstate-webhook/0.log" Mar 10 09:57:53 crc kubenswrapper[4883]: I0310 09:57:53.078133 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3"} Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.164414 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552278-mmnzb"] Mar 10 09:58:00 crc kubenswrapper[4883]: E0310 09:58:00.165555 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="extract-content" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.165574 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="extract-content" Mar 10 09:58:00 crc kubenswrapper[4883]: E0310 09:58:00.165592 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="extract-utilities" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.165599 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="extract-utilities" Mar 10 09:58:00 crc kubenswrapper[4883]: E0310 09:58:00.165622 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="registry-server" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.165630 4883 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="registry-server" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.165867 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa44e40-18b7-44d8-9359-0b11eaa53417" containerName="registry-server" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.166589 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.168392 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.169081 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.172658 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.184127 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-mmnzb"] Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.279221 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j5dm\" (UniqueName: \"kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm\") pod \"auto-csr-approver-29552278-mmnzb\" (UID: \"63837d10-3c84-4972-98da-7415e14f2594\") " pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.381954 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j5dm\" (UniqueName: \"kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm\") pod \"auto-csr-approver-29552278-mmnzb\" (UID: \"63837d10-3c84-4972-98da-7415e14f2594\") " 
pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.401280 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j5dm\" (UniqueName: \"kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm\") pod \"auto-csr-approver-29552278-mmnzb\" (UID: \"63837d10-3c84-4972-98da-7415e14f2594\") " pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.484319 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.920023 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-mmnzb"] Mar 10 09:58:00 crc kubenswrapper[4883]: I0310 09:58:00.924562 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:58:01 crc kubenswrapper[4883]: I0310 09:58:01.143538 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" event={"ID":"63837d10-3c84-4972-98da-7415e14f2594","Type":"ContainerStarted","Data":"ef31c9fb65c55e053a9af5e0ebcb8a4603e140babca44c4fa31676ab6ca88816"} Mar 10 09:58:03 crc kubenswrapper[4883]: I0310 09:58:03.166959 4883 generic.go:334] "Generic (PLEG): container finished" podID="63837d10-3c84-4972-98da-7415e14f2594" containerID="d248878325804477b2b46157dab9cab4990cb786e7cd390c24f00599d57f6825" exitCode=0 Mar 10 09:58:03 crc kubenswrapper[4883]: I0310 09:58:03.167042 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" event={"ID":"63837d10-3c84-4972-98da-7415e14f2594","Type":"ContainerDied","Data":"d248878325804477b2b46157dab9cab4990cb786e7cd390c24f00599d57f6825"} Mar 10 09:58:04 crc kubenswrapper[4883]: I0310 09:58:04.459851 4883 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:04 crc kubenswrapper[4883]: I0310 09:58:04.575644 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j5dm\" (UniqueName: \"kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm\") pod \"63837d10-3c84-4972-98da-7415e14f2594\" (UID: \"63837d10-3c84-4972-98da-7415e14f2594\") " Mar 10 09:58:04 crc kubenswrapper[4883]: I0310 09:58:04.592982 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm" (OuterVolumeSpecName: "kube-api-access-2j5dm") pod "63837d10-3c84-4972-98da-7415e14f2594" (UID: "63837d10-3c84-4972-98da-7415e14f2594"). InnerVolumeSpecName "kube-api-access-2j5dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:58:04 crc kubenswrapper[4883]: I0310 09:58:04.677869 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j5dm\" (UniqueName: \"kubernetes.io/projected/63837d10-3c84-4972-98da-7415e14f2594-kube-api-access-2j5dm\") on node \"crc\" DevicePath \"\"" Mar 10 09:58:05 crc kubenswrapper[4883]: I0310 09:58:05.185432 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" event={"ID":"63837d10-3c84-4972-98da-7415e14f2594","Type":"ContainerDied","Data":"ef31c9fb65c55e053a9af5e0ebcb8a4603e140babca44c4fa31676ab6ca88816"} Mar 10 09:58:05 crc kubenswrapper[4883]: I0310 09:58:05.185512 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef31c9fb65c55e053a9af5e0ebcb8a4603e140babca44c4fa31676ab6ca88816" Mar 10 09:58:05 crc kubenswrapper[4883]: I0310 09:58:05.185519 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552278-mmnzb" Mar 10 09:58:05 crc kubenswrapper[4883]: I0310 09:58:05.532218 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-7b8fg"] Mar 10 09:58:05 crc kubenswrapper[4883]: I0310 09:58:05.538668 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552272-7b8fg"] Mar 10 09:58:06 crc kubenswrapper[4883]: I0310 09:58:06.088595 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd9f896-b725-4a44-825a-9fd728da26b2" path="/var/lib/kubelet/pods/4fd9f896-b725-4a44-825a-9fd728da26b2/volumes" Mar 10 09:58:06 crc kubenswrapper[4883]: I0310 09:58:06.885894 4883 scope.go:117] "RemoveContainer" containerID="55da525eb21d992e868ae1c36ae9269aafbac403e9bea6b7d9b244d9b58e489c" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.010258 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-rtrbh_59437559-8b42-4779-8b72-17f09b50b572/kube-rbac-proxy/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.131230 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-rtrbh_59437559-8b42-4779-8b72-17f09b50b572/controller/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.251964 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.416370 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.429131 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 
09:58:15.443088 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.456992 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.628181 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.629129 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.631081 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.664438 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.804521 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.808496 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.815619 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.819570 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/controller/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.950753 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/frr-metrics/0.log" Mar 10 09:58:15 crc kubenswrapper[4883]: I0310 09:58:15.966197 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/kube-rbac-proxy/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.027761 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/kube-rbac-proxy-frr/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.239233 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-shjnr_8e843a56-715a-44fc-9974-8570d49bd9a0/frr-k8s-webhook-server/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.250093 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/reloader/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.459417 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c79cc77cd-s6vgn_5804aa0d-ee19-4fb3-bd39-27c7103571d8/manager/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.644203 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57848ff665-prp4d_cb05036e-52f2-48ab-ba84-f89c4565a0af/webhook-server/0.log" Mar 10 09:58:16 crc kubenswrapper[4883]: I0310 09:58:16.746615 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gtqfn_6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf/kube-rbac-proxy/0.log" Mar 10 09:58:17 crc kubenswrapper[4883]: I0310 09:58:17.229044 4883 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gtqfn_6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf/speaker/0.log" Mar 10 09:58:17 crc kubenswrapper[4883]: I0310 09:58:17.377847 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/frr/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.118740 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.261998 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.264410 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.296467 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.414780 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.430934 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.436405 4883 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/extract/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.595822 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.752210 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.756576 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.769683 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.896975 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 09:58:28 crc kubenswrapper[4883]: I0310 09:58:28.932408 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.045571 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.300712 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/registry-server/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.314321 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.317306 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.357647 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.512834 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.528408 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.713389 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.906654 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.915196 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.944922 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 09:58:29 crc kubenswrapper[4883]: I0310 09:58:29.960783 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/registry-server/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.128236 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/extract/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.133002 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.145297 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.273007 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9d6jf_849aec1a-3ce6-4153-8e52-4bf0185e29e3/marketplace-operator/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.341087 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.468605 4883 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.468692 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.491308 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.667366 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.674149 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.791164 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/registry-server/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.879985 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.982318 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 09:58:30 crc kubenswrapper[4883]: I0310 09:58:30.992975 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 09:58:31 crc kubenswrapper[4883]: I0310 09:58:31.012038 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 09:58:31 crc kubenswrapper[4883]: I0310 09:58:31.180430 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 09:58:31 crc kubenswrapper[4883]: I0310 09:58:31.188102 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 09:58:31 crc kubenswrapper[4883]: I0310 09:58:31.596331 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/registry-server/0.log" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.129803 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qmm8n"] Mar 10 09:59:23 crc kubenswrapper[4883]: E0310 09:59:23.131160 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63837d10-3c84-4972-98da-7415e14f2594" containerName="oc" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.131179 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="63837d10-3c84-4972-98da-7415e14f2594" containerName="oc" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.133210 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="63837d10-3c84-4972-98da-7415e14f2594" containerName="oc" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.135150 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.143866 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmm8n"] Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.300452 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtc4\" (UniqueName: \"kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.300529 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.300586 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.401714 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtc4\" (UniqueName: \"kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.401757 4883 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.401802 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.402278 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.402341 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.421378 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtc4\" (UniqueName: \"kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4\") pod \"community-operators-qmm8n\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") " pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.469201 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qmm8n" Mar 10 09:59:23 crc kubenswrapper[4883]: I0310 09:59:23.995975 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qmm8n"] Mar 10 09:59:24 crc kubenswrapper[4883]: I0310 09:59:24.902855 4883 generic.go:334] "Generic (PLEG): container finished" podID="a1ddb85f-2071-49f6-a977-999227732efc" containerID="95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174" exitCode=0 Mar 10 09:59:24 crc kubenswrapper[4883]: I0310 09:59:24.902907 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerDied","Data":"95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174"} Mar 10 09:59:24 crc kubenswrapper[4883]: I0310 09:59:24.903440 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerStarted","Data":"8c092cbe46ef3bb250fea53ef43ad5c32d25e6d4568f6eb7244f4a5cff9be4ae"} Mar 10 09:59:25 crc kubenswrapper[4883]: I0310 09:59:25.912511 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerStarted","Data":"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"} Mar 10 09:59:26 crc kubenswrapper[4883]: I0310 09:59:26.941170 4883 generic.go:334] "Generic (PLEG): container finished" podID="a1ddb85f-2071-49f6-a977-999227732efc" containerID="2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154" exitCode=0 Mar 10 09:59:26 crc kubenswrapper[4883]: I0310 09:59:26.941218 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" 
event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerDied","Data":"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"}
Mar 10 09:59:27 crc kubenswrapper[4883]: I0310 09:59:27.952971 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerStarted","Data":"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"}
Mar 10 09:59:27 crc kubenswrapper[4883]: I0310 09:59:27.970895 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qmm8n" podStartSLOduration=2.376253646 podStartE2EDuration="4.970877645s" podCreationTimestamp="2026-03-10 09:59:23 +0000 UTC" firstStartedPulling="2026-03-10 09:59:24.90470951 +0000 UTC m=+3351.159607400" lastFinishedPulling="2026-03-10 09:59:27.49933351 +0000 UTC m=+3353.754231399" observedRunningTime="2026-03-10 09:59:27.965835211 +0000 UTC m=+3354.220733099" watchObservedRunningTime="2026-03-10 09:59:27.970877645 +0000 UTC m=+3354.225775534"
Mar 10 09:59:33 crc kubenswrapper[4883]: I0310 09:59:33.469790 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qmm8n"
Mar 10 09:59:33 crc kubenswrapper[4883]: I0310 09:59:33.471253 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qmm8n"
Mar 10 09:59:33 crc kubenswrapper[4883]: I0310 09:59:33.520115 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qmm8n"
Mar 10 09:59:34 crc kubenswrapper[4883]: I0310 09:59:34.042548 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qmm8n"
Mar 10 09:59:34 crc kubenswrapper[4883]: I0310 09:59:34.091046 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qmm8n"]
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.023064 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qmm8n" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="registry-server" containerID="cri-o://44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9" gracePeriod=2
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.415090 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmm8n"
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.563923 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wtc4\" (UniqueName: \"kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4\") pod \"a1ddb85f-2071-49f6-a977-999227732efc\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") "
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.564386 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content\") pod \"a1ddb85f-2071-49f6-a977-999227732efc\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") "
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.564440 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities\") pod \"a1ddb85f-2071-49f6-a977-999227732efc\" (UID: \"a1ddb85f-2071-49f6-a977-999227732efc\") "
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.565959 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities" (OuterVolumeSpecName: "utilities") pod "a1ddb85f-2071-49f6-a977-999227732efc" (UID: "a1ddb85f-2071-49f6-a977-999227732efc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.569115 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4" (OuterVolumeSpecName: "kube-api-access-4wtc4") pod "a1ddb85f-2071-49f6-a977-999227732efc" (UID: "a1ddb85f-2071-49f6-a977-999227732efc"). InnerVolumeSpecName "kube-api-access-4wtc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.615757 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1ddb85f-2071-49f6-a977-999227732efc" (UID: "a1ddb85f-2071-49f6-a977-999227732efc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.666859 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wtc4\" (UniqueName: \"kubernetes.io/projected/a1ddb85f-2071-49f6-a977-999227732efc-kube-api-access-4wtc4\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.666893 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:36 crc kubenswrapper[4883]: I0310 09:59:36.666903 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ddb85f-2071-49f6-a977-999227732efc-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.032938 4883 generic.go:334] "Generic (PLEG): container finished" podID="a1ddb85f-2071-49f6-a977-999227732efc" containerID="44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9" exitCode=0
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.032989 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerDied","Data":"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"}
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.033019 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qmm8n" event={"ID":"a1ddb85f-2071-49f6-a977-999227732efc","Type":"ContainerDied","Data":"8c092cbe46ef3bb250fea53ef43ad5c32d25e6d4568f6eb7244f4a5cff9be4ae"}
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.033036 4883 scope.go:117] "RemoveContainer" containerID="44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.033170 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qmm8n"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.060966 4883 scope.go:117] "RemoveContainer" containerID="2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.065158 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qmm8n"]
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.075492 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qmm8n"]
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.091980 4883 scope.go:117] "RemoveContainer" containerID="95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.119417 4883 scope.go:117] "RemoveContainer" containerID="44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"
Mar 10 09:59:37 crc kubenswrapper[4883]: E0310 09:59:37.119895 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9\": container with ID starting with 44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9 not found: ID does not exist" containerID="44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.119945 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9"} err="failed to get container status \"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9\": rpc error: code = NotFound desc = could not find container \"44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9\": container with ID starting with 44ad291e464869482648b6f31df98a681b2986d149f32663f136ac2f1bfa2ad9 not found: ID does not exist"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.119979 4883 scope.go:117] "RemoveContainer" containerID="2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"
Mar 10 09:59:37 crc kubenswrapper[4883]: E0310 09:59:37.120511 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154\": container with ID starting with 2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154 not found: ID does not exist" containerID="2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.120587 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154"} err="failed to get container status \"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154\": rpc error: code = NotFound desc = could not find container \"2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154\": container with ID starting with 2f3af6e06bd51467b3e19f6827087d2077a16213173c3fddc777fc2d08d6d154 not found: ID does not exist"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.120638 4883 scope.go:117] "RemoveContainer" containerID="95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174"
Mar 10 09:59:37 crc kubenswrapper[4883]: E0310 09:59:37.121073 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174\": container with ID starting with 95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174 not found: ID does not exist" containerID="95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174"
Mar 10 09:59:37 crc kubenswrapper[4883]: I0310 09:59:37.121110 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174"} err="failed to get container status \"95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174\": rpc error: code = NotFound desc = could not find container \"95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174\": container with ID starting with 95d8577ae959bfadcb69c2fb2a185cec8e268465280f0cf09a56dd740795f174 not found: ID does not exist"
Mar 10 09:59:38 crc kubenswrapper[4883]: I0310 09:59:38.093298 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ddb85f-2071-49f6-a977-999227732efc" path="/var/lib/kubelet/pods/a1ddb85f-2071-49f6-a977-999227732efc/volumes"
Mar 10 09:59:59 crc kubenswrapper[4883]: I0310 09:59:59.231082 4883 generic.go:334] "Generic (PLEG): container finished" podID="e940d297-b038-48e9-a4bd-777df629de28" containerID="5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6" exitCode=0
Mar 10 09:59:59 crc kubenswrapper[4883]: I0310 09:59:59.231148 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" event={"ID":"e940d297-b038-48e9-a4bd-777df629de28","Type":"ContainerDied","Data":"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"}
Mar 10 09:59:59 crc kubenswrapper[4883]: I0310 09:59:59.232590 4883 scope.go:117] "RemoveContainer" containerID="5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"
Mar 10 09:59:59 crc kubenswrapper[4883]: I0310 09:59:59.334804 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9vgtk_must-gather-bkkgz_e940d297-b038-48e9-a4bd-777df629de28/gather/0.log"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.142512 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552280-8d8wl"]
Mar 10 10:00:00 crc kubenswrapper[4883]: E0310 10:00:00.143448 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="extract-utilities"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.143572 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="extract-utilities"
Mar 10 10:00:00 crc kubenswrapper[4883]: E0310 10:00:00.143654 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="extract-content"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.143709 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="extract-content"
Mar 10 10:00:00 crc kubenswrapper[4883]: E0310 10:00:00.143799 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="registry-server"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.143847 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="registry-server"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.144118 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ddb85f-2071-49f6-a977-999227732efc" containerName="registry-server"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.144892 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-8d8wl"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.146927 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.147220 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.147309 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.152200 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"]
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.153201 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.154505 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.154712 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.162334 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-8d8wl"]
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.173418 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"]
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.217553 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j2qf\" (UniqueName: \"kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.217588 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnwf8\" (UniqueName: \"kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8\") pod \"auto-csr-approver-29552280-8d8wl\" (UID: \"f7b2dc78-bc43-4cf8-a946-509772bb2522\") " pod="openshift-infra/auto-csr-approver-29552280-8d8wl"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.217712 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.217736 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.318973 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnwf8\" (UniqueName: \"kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8\") pod \"auto-csr-approver-29552280-8d8wl\" (UID: \"f7b2dc78-bc43-4cf8-a946-509772bb2522\") " pod="openshift-infra/auto-csr-approver-29552280-8d8wl"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.319122 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.319169 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.319273 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j2qf\" (UniqueName: \"kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.320254 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.339538 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.342039 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnwf8\" (UniqueName: \"kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8\") pod \"auto-csr-approver-29552280-8d8wl\" (UID: \"f7b2dc78-bc43-4cf8-a946-509772bb2522\") " pod="openshift-infra/auto-csr-approver-29552280-8d8wl"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.342936 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j2qf\" (UniqueName: \"kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf\") pod \"collect-profiles-29552280-glfq4\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.469884 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-8d8wl"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.480323 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.940542 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"]
Mar 10 10:00:00 crc kubenswrapper[4883]: I0310 10:00:00.948866 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-8d8wl"]
Mar 10 10:00:01 crc kubenswrapper[4883]: I0310 10:00:01.250046 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4" event={"ID":"fa19fde8-da7f-4160-8ac1-79860fb75e66","Type":"ContainerStarted","Data":"d68b6023f7937a4e68bf3efce21061e98f0d818f1ee1d42d9743a6027a1ab521"}
Mar 10 10:00:01 crc kubenswrapper[4883]: I0310 10:00:01.250117 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4" event={"ID":"fa19fde8-da7f-4160-8ac1-79860fb75e66","Type":"ContainerStarted","Data":"f09ebad2386e0c657de76dfbc8e429c470df668ddc5e2663f504ce9691501d95"}
Mar 10 10:00:01 crc kubenswrapper[4883]: I0310 10:00:01.251396 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" event={"ID":"f7b2dc78-bc43-4cf8-a946-509772bb2522","Type":"ContainerStarted","Data":"fb8b19a18b53bfac4eac10e0cd5189c015cb4d934110a7e3faf39d5f432028c6"}
Mar 10 10:00:02 crc kubenswrapper[4883]: I0310 10:00:02.262437 4883 generic.go:334] "Generic (PLEG): container finished" podID="fa19fde8-da7f-4160-8ac1-79860fb75e66" containerID="d68b6023f7937a4e68bf3efce21061e98f0d818f1ee1d42d9743a6027a1ab521" exitCode=0
Mar 10 10:00:02 crc kubenswrapper[4883]: I0310 10:00:02.262532 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4" event={"ID":"fa19fde8-da7f-4160-8ac1-79860fb75e66","Type":"ContainerDied","Data":"d68b6023f7937a4e68bf3efce21061e98f0d818f1ee1d42d9743a6027a1ab521"}
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.578014 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.692438 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j2qf\" (UniqueName: \"kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf\") pod \"fa19fde8-da7f-4160-8ac1-79860fb75e66\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") "
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.692522 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume\") pod \"fa19fde8-da7f-4160-8ac1-79860fb75e66\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") "
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.692626 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume\") pod \"fa19fde8-da7f-4160-8ac1-79860fb75e66\" (UID: \"fa19fde8-da7f-4160-8ac1-79860fb75e66\") "
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.693510 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa19fde8-da7f-4160-8ac1-79860fb75e66" (UID: "fa19fde8-da7f-4160-8ac1-79860fb75e66"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.698768 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa19fde8-da7f-4160-8ac1-79860fb75e66" (UID: "fa19fde8-da7f-4160-8ac1-79860fb75e66"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.698983 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf" (OuterVolumeSpecName: "kube-api-access-4j2qf") pod "fa19fde8-da7f-4160-8ac1-79860fb75e66" (UID: "fa19fde8-da7f-4160-8ac1-79860fb75e66"). InnerVolumeSpecName "kube-api-access-4j2qf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.794531 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j2qf\" (UniqueName: \"kubernetes.io/projected/fa19fde8-da7f-4160-8ac1-79860fb75e66-kube-api-access-4j2qf\") on node \"crc\" DevicePath \"\""
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.794554 4883 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa19fde8-da7f-4160-8ac1-79860fb75e66-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 10:00:03 crc kubenswrapper[4883]: I0310 10:00:03.794564 4883 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa19fde8-da7f-4160-8ac1-79860fb75e66-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 10:00:04 crc kubenswrapper[4883]: I0310 10:00:04.282987 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4" event={"ID":"fa19fde8-da7f-4160-8ac1-79860fb75e66","Type":"ContainerDied","Data":"f09ebad2386e0c657de76dfbc8e429c470df668ddc5e2663f504ce9691501d95"}
Mar 10 10:00:04 crc kubenswrapper[4883]: I0310 10:00:04.283033 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f09ebad2386e0c657de76dfbc8e429c470df668ddc5e2663f504ce9691501d95"
Mar 10 10:00:04 crc kubenswrapper[4883]: I0310 10:00:04.283049 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552280-glfq4"
Mar 10 10:00:04 crc kubenswrapper[4883]: I0310 10:00:04.342159 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"]
Mar 10 10:00:04 crc kubenswrapper[4883]: I0310 10:00:04.348096 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-9wd67"]
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.092364 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16ab2a6-c8ca-4487-b42f-381f61d18ba0" path="/var/lib/kubelet/pods/e16ab2a6-c8ca-4487-b42f-381f61d18ba0/volumes"
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.287551 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9vgtk/must-gather-bkkgz"]
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.287848 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9vgtk/must-gather-bkkgz" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="copy" containerID="cri-o://765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615" gracePeriod=2
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.297336 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9vgtk/must-gather-bkkgz"]
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.683614 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9vgtk_must-gather-bkkgz_e940d297-b038-48e9-a4bd-777df629de28/copy/0.log"
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.684081 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/must-gather-bkkgz"
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.855413 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm952\" (UniqueName: \"kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952\") pod \"e940d297-b038-48e9-a4bd-777df629de28\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") "
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.855747 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output\") pod \"e940d297-b038-48e9-a4bd-777df629de28\" (UID: \"e940d297-b038-48e9-a4bd-777df629de28\") "
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.860368 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952" (OuterVolumeSpecName: "kube-api-access-gm952") pod "e940d297-b038-48e9-a4bd-777df629de28" (UID: "e940d297-b038-48e9-a4bd-777df629de28"). InnerVolumeSpecName "kube-api-access-gm952". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.958659 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm952\" (UniqueName: \"kubernetes.io/projected/e940d297-b038-48e9-a4bd-777df629de28-kube-api-access-gm952\") on node \"crc\" DevicePath \"\""
Mar 10 10:00:06 crc kubenswrapper[4883]: I0310 10:00:06.982706 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e940d297-b038-48e9-a4bd-777df629de28" (UID: "e940d297-b038-48e9-a4bd-777df629de28"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.043315 4883 scope.go:117] "RemoveContainer" containerID="1cb9093a5dc1551f7fb85ef25abe36d1ab423453387c5dcc49644004e7492e56"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.061564 4883 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e940d297-b038-48e9-a4bd-777df629de28-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.314539 4883 generic.go:334] "Generic (PLEG): container finished" podID="e940d297-b038-48e9-a4bd-777df629de28" containerID="765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615" exitCode=143
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.314650 4883 scope.go:117] "RemoveContainer" containerID="765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.314643 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9vgtk/must-gather-bkkgz"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.339628 4883 scope.go:117] "RemoveContainer" containerID="5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.409697 4883 scope.go:117] "RemoveContainer" containerID="765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615"
Mar 10 10:00:07 crc kubenswrapper[4883]: E0310 10:00:07.410145 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615\": container with ID starting with 765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615 not found: ID does not exist" containerID="765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.410178 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615"} err="failed to get container status \"765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615\": rpc error: code = NotFound desc = could not find container \"765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615\": container with ID starting with 765227e1929693d8af7335feeb24f2a87b2243a2a9abe1d52960a0c3a3479615 not found: ID does not exist"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.410201 4883 scope.go:117] "RemoveContainer" containerID="5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"
Mar 10 10:00:07 crc kubenswrapper[4883]: E0310 10:00:07.410606 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6\": container with ID starting with 5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6 not found: ID does not exist" containerID="5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"
Mar 10 10:00:07 crc kubenswrapper[4883]: I0310 10:00:07.410652 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6"} err="failed to get container status \"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6\": rpc error: code = NotFound desc = could not find container \"5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6\": container with ID starting with 5388eb6d68eb12452b0afea411ee22d79358cc2bd67222711c7aaf7db3beeff6 not found: ID does not exist"
Mar 10 10:00:08 crc kubenswrapper[4883]: I0310 10:00:08.095972 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e940d297-b038-48e9-a4bd-777df629de28" path="/var/lib/kubelet/pods/e940d297-b038-48e9-a4bd-777df629de28/volumes"
Mar 10 10:00:15 crc kubenswrapper[4883]: I0310 10:00:15.407668 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" event={"ID":"f7b2dc78-bc43-4cf8-a946-509772bb2522","Type":"ContainerStarted","Data":"b6eef7765d3e9379f71ea8bde62b6fb0aea46e71cd54c2a9f6788a8b9c14680a"}
Mar 10 10:00:15 crc kubenswrapper[4883]: I0310 10:00:15.431281 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" podStartSLOduration=1.20029418 podStartE2EDuration="15.43126581s" podCreationTimestamp="2026-03-10 10:00:00 +0000 UTC" firstStartedPulling="2026-03-10 10:00:00.945128374 +0000 UTC m=+3387.200026263" lastFinishedPulling="2026-03-10 10:00:15.176100004 +0000 UTC m=+3401.430997893" observedRunningTime="2026-03-10 10:00:15.421283455 +0000 UTC m=+3401.676181345" watchObservedRunningTime="2026-03-10 10:00:15.43126581 +0000 UTC m=+3401.686163699"
Mar 10 10:00:16 crc kubenswrapper[4883]: I0310 10:00:16.419994 4883 generic.go:334] "Generic (PLEG): container finished" podID="f7b2dc78-bc43-4cf8-a946-509772bb2522" containerID="b6eef7765d3e9379f71ea8bde62b6fb0aea46e71cd54c2a9f6788a8b9c14680a" exitCode=0
Mar 10 10:00:16 crc kubenswrapper[4883]: I0310 10:00:16.420067 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" event={"ID":"f7b2dc78-bc43-4cf8-a946-509772bb2522","Type":"ContainerDied","Data":"b6eef7765d3e9379f71ea8bde62b6fb0aea46e71cd54c2a9f6788a8b9c14680a"}
Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.449320 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.449795 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.758834 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-8d8wl"
Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.892742 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnwf8\" (UniqueName: \"kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8\") pod \"f7b2dc78-bc43-4cf8-a946-509772bb2522\" (UID: \"f7b2dc78-bc43-4cf8-a946-509772bb2522\") "
Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.900510 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8" (OuterVolumeSpecName: "kube-api-access-jnwf8") pod "f7b2dc78-bc43-4cf8-a946-509772bb2522" (UID: "f7b2dc78-bc43-4cf8-a946-509772bb2522"). InnerVolumeSpecName "kube-api-access-jnwf8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:00:17 crc kubenswrapper[4883]: I0310 10:00:17.996828 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnwf8\" (UniqueName: \"kubernetes.io/projected/f7b2dc78-bc43-4cf8-a946-509772bb2522-kube-api-access-jnwf8\") on node \"crc\" DevicePath \"\""
Mar 10 10:00:18 crc kubenswrapper[4883]: I0310 10:00:18.446826 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" event={"ID":"f7b2dc78-bc43-4cf8-a946-509772bb2522","Type":"ContainerDied","Data":"fb8b19a18b53bfac4eac10e0cd5189c015cb4d934110a7e3faf39d5f432028c6"}
Mar 10 10:00:18 crc kubenswrapper[4883]: I0310 10:00:18.446908 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb8b19a18b53bfac4eac10e0cd5189c015cb4d934110a7e3faf39d5f432028c6"
Mar 10 10:00:18 crc kubenswrapper[4883]: I0310 10:00:18.446907 4883 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552280-8d8wl" Mar 10 10:00:18 crc kubenswrapper[4883]: I0310 10:00:18.487102 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-p9wph"] Mar 10 10:00:18 crc kubenswrapper[4883]: I0310 10:00:18.497518 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552274-p9wph"] Mar 10 10:00:20 crc kubenswrapper[4883]: I0310 10:00:20.089794 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05d7064d-bd5b-4775-ab8f-2d5780f76440" path="/var/lib/kubelet/pods/05d7064d-bd5b-4775-ab8f-2d5780f76440/volumes" Mar 10 10:00:47 crc kubenswrapper[4883]: I0310 10:00:47.449096 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:00:47 crc kubenswrapper[4883]: I0310 10:00:47.449709 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.148591 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29552281-kbhqs"] Mar 10 10:01:00 crc kubenswrapper[4883]: E0310 10:01:00.149438 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b2dc78-bc43-4cf8-a946-509772bb2522" containerName="oc" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149451 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b2dc78-bc43-4cf8-a946-509772bb2522" containerName="oc" Mar 10 10:01:00 crc kubenswrapper[4883]: 
E0310 10:01:00.149463 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="gather" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149489 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="gather" Mar 10 10:01:00 crc kubenswrapper[4883]: E0310 10:01:00.149503 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="copy" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149509 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="copy" Mar 10 10:01:00 crc kubenswrapper[4883]: E0310 10:01:00.149535 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa19fde8-da7f-4160-8ac1-79860fb75e66" containerName="collect-profiles" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149540 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa19fde8-da7f-4160-8ac1-79860fb75e66" containerName="collect-profiles" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149704 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa19fde8-da7f-4160-8ac1-79860fb75e66" containerName="collect-profiles" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149712 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="copy" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149728 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="e940d297-b038-48e9-a4bd-777df629de28" containerName="gather" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.149742 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b2dc78-bc43-4cf8-a946-509772bb2522" containerName="oc" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.150269 4883 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.160291 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552281-kbhqs"] Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.258961 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.259544 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmtf\" (UniqueName: \"kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.259726 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.259831 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.362544 4883 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.362667 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.362850 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.362944 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsmtf\" (UniqueName: \"kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.368343 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.368940 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.369844 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.377452 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmtf\" (UniqueName: \"kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf\") pod \"keystone-cron-29552281-kbhqs\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.466098 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:00 crc kubenswrapper[4883]: I0310 10:01:00.955517 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552281-kbhqs"] Mar 10 10:01:01 crc kubenswrapper[4883]: I0310 10:01:01.828303 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552281-kbhqs" event={"ID":"845ef92d-2dae-49c8-823f-9e3fe2735d79","Type":"ContainerStarted","Data":"fef295dc078c82d1da33ddf95186c154f12a13384b7e5473baf8c10544a1d019"} Mar 10 10:01:01 crc kubenswrapper[4883]: I0310 10:01:01.828963 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552281-kbhqs" event={"ID":"845ef92d-2dae-49c8-823f-9e3fe2735d79","Type":"ContainerStarted","Data":"10bfdfe3a827e673cb657120f380660eb88dc656044d982f500660c075167ae0"} Mar 10 10:01:01 crc kubenswrapper[4883]: I0310 10:01:01.852681 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29552281-kbhqs" podStartSLOduration=1.8526607130000001 podStartE2EDuration="1.852660713s" podCreationTimestamp="2026-03-10 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:01:01.850869327 +0000 UTC m=+3448.105767216" watchObservedRunningTime="2026-03-10 10:01:01.852660713 +0000 UTC m=+3448.107558602" Mar 10 10:01:03 crc kubenswrapper[4883]: I0310 10:01:03.852417 4883 generic.go:334] "Generic (PLEG): container finished" podID="845ef92d-2dae-49c8-823f-9e3fe2735d79" containerID="fef295dc078c82d1da33ddf95186c154f12a13384b7e5473baf8c10544a1d019" exitCode=0 Mar 10 10:01:03 crc kubenswrapper[4883]: I0310 10:01:03.852771 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552281-kbhqs" 
event={"ID":"845ef92d-2dae-49c8-823f-9e3fe2735d79","Type":"ContainerDied","Data":"fef295dc078c82d1da33ddf95186c154f12a13384b7e5473baf8c10544a1d019"} Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.181703 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.267817 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsmtf\" (UniqueName: \"kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf\") pod \"845ef92d-2dae-49c8-823f-9e3fe2735d79\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.267955 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys\") pod \"845ef92d-2dae-49c8-823f-9e3fe2735d79\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.268142 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data\") pod \"845ef92d-2dae-49c8-823f-9e3fe2735d79\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.268217 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle\") pod \"845ef92d-2dae-49c8-823f-9e3fe2735d79\" (UID: \"845ef92d-2dae-49c8-823f-9e3fe2735d79\") " Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.274050 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf" 
(OuterVolumeSpecName: "kube-api-access-fsmtf") pod "845ef92d-2dae-49c8-823f-9e3fe2735d79" (UID: "845ef92d-2dae-49c8-823f-9e3fe2735d79"). InnerVolumeSpecName "kube-api-access-fsmtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.276272 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "845ef92d-2dae-49c8-823f-9e3fe2735d79" (UID: "845ef92d-2dae-49c8-823f-9e3fe2735d79"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.292670 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "845ef92d-2dae-49c8-823f-9e3fe2735d79" (UID: "845ef92d-2dae-49c8-823f-9e3fe2735d79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.307406 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data" (OuterVolumeSpecName: "config-data") pod "845ef92d-2dae-49c8-823f-9e3fe2735d79" (UID: "845ef92d-2dae-49c8-823f-9e3fe2735d79"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.370627 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsmtf\" (UniqueName: \"kubernetes.io/projected/845ef92d-2dae-49c8-823f-9e3fe2735d79-kube-api-access-fsmtf\") on node \"crc\" DevicePath \"\"" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.370663 4883 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.370674 4883 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.370683 4883 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845ef92d-2dae-49c8-823f-9e3fe2735d79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.873387 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552281-kbhqs" event={"ID":"845ef92d-2dae-49c8-823f-9e3fe2735d79","Type":"ContainerDied","Data":"10bfdfe3a827e673cb657120f380660eb88dc656044d982f500660c075167ae0"} Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.873448 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10bfdfe3a827e673cb657120f380660eb88dc656044d982f500660c075167ae0" Mar 10 10:01:05 crc kubenswrapper[4883]: I0310 10:01:05.873628 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552281-kbhqs" Mar 10 10:01:07 crc kubenswrapper[4883]: I0310 10:01:07.104861 4883 scope.go:117] "RemoveContainer" containerID="37b94d707e9a0d88465d35e2d3c44d0202d4cba279ab4acfb2218748019ab99d" Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.449070 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.449829 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.449894 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.451081 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3"} pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.451159 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" 
containerID="cri-o://02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3" gracePeriod=600 Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.975311 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3" exitCode=0 Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.975391 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3"} Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.975721 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"} Mar 10 10:01:17 crc kubenswrapper[4883]: I0310 10:01:17.975752 4883 scope.go:117] "RemoveContainer" containerID="0f47dfe612330033a0a0ceebe46434093bf194f5248ff3d9b94ad28bd1bcb79f" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.143324 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552282-pdd7d"] Mar 10 10:02:00 crc kubenswrapper[4883]: E0310 10:02:00.144268 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845ef92d-2dae-49c8-823f-9e3fe2735d79" containerName="keystone-cron" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.144283 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="845ef92d-2dae-49c8-823f-9e3fe2735d79" containerName="keystone-cron" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.144510 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="845ef92d-2dae-49c8-823f-9e3fe2735d79" containerName="keystone-cron" Mar 10 10:02:00 crc 
kubenswrapper[4883]: I0310 10:02:00.145126 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.146522 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.147345 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.147542 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.150549 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-pdd7d"] Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.320823 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl\") pod \"auto-csr-approver-29552282-pdd7d\" (UID: \"38ca7076-03d6-4598-a451-cf485909b9fc\") " pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.423788 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl\") pod \"auto-csr-approver-29552282-pdd7d\" (UID: \"38ca7076-03d6-4598-a451-cf485909b9fc\") " pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.444236 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl\") pod 
\"auto-csr-approver-29552282-pdd7d\" (UID: \"38ca7076-03d6-4598-a451-cf485909b9fc\") " pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.460368 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:00 crc kubenswrapper[4883]: I0310 10:02:00.849561 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-pdd7d"] Mar 10 10:02:01 crc kubenswrapper[4883]: I0310 10:02:01.366326 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" event={"ID":"38ca7076-03d6-4598-a451-cf485909b9fc","Type":"ContainerStarted","Data":"3c7b236e6abbfc9788ca950dcd363c22954f75dcb93bfc44c7bf284dbb489231"} Mar 10 10:02:02 crc kubenswrapper[4883]: I0310 10:02:02.377562 4883 generic.go:334] "Generic (PLEG): container finished" podID="38ca7076-03d6-4598-a451-cf485909b9fc" containerID="a0e857f4b7d8648de7f831deee2dafbed19a142ef98ff1e54d0826fe04524086" exitCode=0 Mar 10 10:02:02 crc kubenswrapper[4883]: I0310 10:02:02.377666 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" event={"ID":"38ca7076-03d6-4598-a451-cf485909b9fc","Type":"ContainerDied","Data":"a0e857f4b7d8648de7f831deee2dafbed19a142ef98ff1e54d0826fe04524086"} Mar 10 10:02:03 crc kubenswrapper[4883]: I0310 10:02:03.688513 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:03 crc kubenswrapper[4883]: I0310 10:02:03.794373 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl\") pod \"38ca7076-03d6-4598-a451-cf485909b9fc\" (UID: \"38ca7076-03d6-4598-a451-cf485909b9fc\") " Mar 10 10:02:03 crc kubenswrapper[4883]: I0310 10:02:03.799789 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl" (OuterVolumeSpecName: "kube-api-access-wjvdl") pod "38ca7076-03d6-4598-a451-cf485909b9fc" (UID: "38ca7076-03d6-4598-a451-cf485909b9fc"). InnerVolumeSpecName "kube-api-access-wjvdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:02:03 crc kubenswrapper[4883]: I0310 10:02:03.898284 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/38ca7076-03d6-4598-a451-cf485909b9fc-kube-api-access-wjvdl\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:04 crc kubenswrapper[4883]: I0310 10:02:04.397889 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" event={"ID":"38ca7076-03d6-4598-a451-cf485909b9fc","Type":"ContainerDied","Data":"3c7b236e6abbfc9788ca950dcd363c22954f75dcb93bfc44c7bf284dbb489231"} Mar 10 10:02:04 crc kubenswrapper[4883]: I0310 10:02:04.398264 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c7b236e6abbfc9788ca950dcd363c22954f75dcb93bfc44c7bf284dbb489231" Mar 10 10:02:04 crc kubenswrapper[4883]: I0310 10:02:04.397937 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552282-pdd7d" Mar 10 10:02:04 crc kubenswrapper[4883]: I0310 10:02:04.755576 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-gr77g"] Mar 10 10:02:04 crc kubenswrapper[4883]: I0310 10:02:04.761449 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552276-gr77g"] Mar 10 10:02:06 crc kubenswrapper[4883]: I0310 10:02:06.088540 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c430570-1b7a-4e38-9a8b-f13d69c18882" path="/var/lib/kubelet/pods/4c430570-1b7a-4e38-9a8b-f13d69c18882/volumes" Mar 10 10:02:07 crc kubenswrapper[4883]: I0310 10:02:07.193050 4883 scope.go:117] "RemoveContainer" containerID="637bb0552a72f7592feca76119ba2d1ac02ce406f7badd427582618ad5b1a1db" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.860215 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:26 crc kubenswrapper[4883]: E0310 10:02:26.861242 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ca7076-03d6-4598-a451-cf485909b9fc" containerName="oc" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.861257 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ca7076-03d6-4598-a451-cf485909b9fc" containerName="oc" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.861459 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ca7076-03d6-4598-a451-cf485909b9fc" containerName="oc" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.862811 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.882491 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.974612 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.974886 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:26 crc kubenswrapper[4883]: I0310 10:02:26.974989 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf72p\" (UniqueName: \"kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.075555 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.075651 4883 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jf72p\" (UniqueName: \"kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.075719 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.076095 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.076307 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.094814 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf72p\" (UniqueName: \"kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p\") pod \"redhat-operators-bvrxx\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.183612 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:27 crc kubenswrapper[4883]: I0310 10:02:27.608406 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:27 crc kubenswrapper[4883]: W0310 10:02:27.612356 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aacda7b_a599_43b2_9c44_920593c90e36.slice/crio-2ea9a53065218527a1d073b08744fda8c491f3eb86123e90ad0000b1e2faa81e WatchSource:0}: Error finding container 2ea9a53065218527a1d073b08744fda8c491f3eb86123e90ad0000b1e2faa81e: Status 404 returned error can't find the container with id 2ea9a53065218527a1d073b08744fda8c491f3eb86123e90ad0000b1e2faa81e Mar 10 10:02:27 crc kubenswrapper[4883]: E0310 10:02:27.972255 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aacda7b_a599_43b2_9c44_920593c90e36.slice/crio-conmon-3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aacda7b_a599_43b2_9c44_920593c90e36.slice/crio-3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a.scope\": RecentStats: unable to find data in memory cache]" Mar 10 10:02:28 crc kubenswrapper[4883]: I0310 10:02:28.608464 4883 generic.go:334] "Generic (PLEG): container finished" podID="3aacda7b-a599-43b2-9c44-920593c90e36" containerID="3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a" exitCode=0 Mar 10 10:02:28 crc kubenswrapper[4883]: I0310 10:02:28.608598 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" 
event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerDied","Data":"3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a"} Mar 10 10:02:28 crc kubenswrapper[4883]: I0310 10:02:28.608919 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerStarted","Data":"2ea9a53065218527a1d073b08744fda8c491f3eb86123e90ad0000b1e2faa81e"} Mar 10 10:02:29 crc kubenswrapper[4883]: I0310 10:02:29.617524 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerStarted","Data":"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422"} Mar 10 10:02:30 crc kubenswrapper[4883]: I0310 10:02:30.629646 4883 generic.go:334] "Generic (PLEG): container finished" podID="3aacda7b-a599-43b2-9c44-920593c90e36" containerID="4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422" exitCode=0 Mar 10 10:02:30 crc kubenswrapper[4883]: I0310 10:02:30.629773 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerDied","Data":"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422"} Mar 10 10:02:31 crc kubenswrapper[4883]: I0310 10:02:31.639076 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerStarted","Data":"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74"} Mar 10 10:02:31 crc kubenswrapper[4883]: I0310 10:02:31.655034 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bvrxx" podStartSLOduration=3.141951653 podStartE2EDuration="5.655014617s" podCreationTimestamp="2026-03-10 10:02:26 +0000 UTC" 
firstStartedPulling="2026-03-10 10:02:28.611220243 +0000 UTC m=+3534.866118133" lastFinishedPulling="2026-03-10 10:02:31.124283208 +0000 UTC m=+3537.379181097" observedRunningTime="2026-03-10 10:02:31.651842359 +0000 UTC m=+3537.906740248" watchObservedRunningTime="2026-03-10 10:02:31.655014617 +0000 UTC m=+3537.909912506" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.417731 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8cw7n/must-gather-29w7p"] Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.419815 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.427261 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8cw7n"/"openshift-service-ca.crt" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.427510 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8cw7n"/"kube-root-ca.crt" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.451962 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qphvx\" (UniqueName: \"kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.452159 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.462123 4883 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-must-gather-8cw7n/must-gather-29w7p"] Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.555124 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qphvx\" (UniqueName: \"kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.555502 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.555958 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.574967 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qphvx\" (UniqueName: \"kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx\") pod \"must-gather-29w7p\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:33 crc kubenswrapper[4883]: I0310 10:02:33.738962 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:02:34 crc kubenswrapper[4883]: I0310 10:02:34.239519 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8cw7n/must-gather-29w7p"] Mar 10 10:02:34 crc kubenswrapper[4883]: W0310 10:02:34.244108 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20b00153_6497_4507_8247_81caa30a91bc.slice/crio-a0c91bdfe040142ab7c035e1e9087020190db2dfd57aff285cbc0fe251003974 WatchSource:0}: Error finding container a0c91bdfe040142ab7c035e1e9087020190db2dfd57aff285cbc0fe251003974: Status 404 returned error can't find the container with id a0c91bdfe040142ab7c035e1e9087020190db2dfd57aff285cbc0fe251003974 Mar 10 10:02:34 crc kubenswrapper[4883]: I0310 10:02:34.669751 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/must-gather-29w7p" event={"ID":"20b00153-6497-4507-8247-81caa30a91bc","Type":"ContainerStarted","Data":"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4"} Mar 10 10:02:34 crc kubenswrapper[4883]: I0310 10:02:34.670195 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/must-gather-29w7p" event={"ID":"20b00153-6497-4507-8247-81caa30a91bc","Type":"ContainerStarted","Data":"a0c91bdfe040142ab7c035e1e9087020190db2dfd57aff285cbc0fe251003974"} Mar 10 10:02:35 crc kubenswrapper[4883]: I0310 10:02:35.679291 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/must-gather-29w7p" event={"ID":"20b00153-6497-4507-8247-81caa30a91bc","Type":"ContainerStarted","Data":"212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183"} Mar 10 10:02:35 crc kubenswrapper[4883]: I0310 10:02:35.693067 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8cw7n/must-gather-29w7p" podStartSLOduration=2.693048208 
podStartE2EDuration="2.693048208s" podCreationTimestamp="2026-03-10 10:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:02:35.692257349 +0000 UTC m=+3541.947155228" watchObservedRunningTime="2026-03-10 10:02:35.693048208 +0000 UTC m=+3541.947946097" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.184056 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.184407 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.227753 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.582682 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-9f7pq"] Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.584053 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.586452 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8cw7n"/"default-dockercfg-68b9q" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.655571 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62xkz\" (UniqueName: \"kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.655987 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.731766 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.757976 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62xkz\" (UniqueName: \"kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.758161 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" 
Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.758286 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.776914 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62xkz\" (UniqueName: \"kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz\") pod \"crc-debug-9f7pq\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.778637 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:37 crc kubenswrapper[4883]: I0310 10:02:37.898827 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:02:38 crc kubenswrapper[4883]: I0310 10:02:38.704469 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" event={"ID":"71aa2959-c8ee-46b8-bd2b-654620fbd99a","Type":"ContainerStarted","Data":"3bd8ffbe632d745f21bae1bcada8aafee95f3789b72071a8dc962e253bd69366"} Mar 10 10:02:38 crc kubenswrapper[4883]: I0310 10:02:38.705038 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" event={"ID":"71aa2959-c8ee-46b8-bd2b-654620fbd99a","Type":"ContainerStarted","Data":"cb0e60bc99c835c81493440e0b3625b06eace07b336b1d7ae7500b2ea5926dfa"} Mar 10 10:02:38 crc kubenswrapper[4883]: I0310 10:02:38.723441 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" podStartSLOduration=1.7233978410000002 podStartE2EDuration="1.723397841s" podCreationTimestamp="2026-03-10 10:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 10:02:38.714252149 +0000 UTC m=+3544.969150038" watchObservedRunningTime="2026-03-10 10:02:38.723397841 +0000 UTC m=+3544.978295729" Mar 10 10:02:39 crc kubenswrapper[4883]: I0310 10:02:39.714631 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bvrxx" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="registry-server" containerID="cri-o://90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74" gracePeriod=2 Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.154862 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.210247 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content\") pod \"3aacda7b-a599-43b2-9c44-920593c90e36\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.210468 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf72p\" (UniqueName: \"kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p\") pod \"3aacda7b-a599-43b2-9c44-920593c90e36\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.210585 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities\") pod \"3aacda7b-a599-43b2-9c44-920593c90e36\" (UID: \"3aacda7b-a599-43b2-9c44-920593c90e36\") " Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.211771 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities" (OuterVolumeSpecName: "utilities") pod "3aacda7b-a599-43b2-9c44-920593c90e36" (UID: "3aacda7b-a599-43b2-9c44-920593c90e36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.220594 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p" (OuterVolumeSpecName: "kube-api-access-jf72p") pod "3aacda7b-a599-43b2-9c44-920593c90e36" (UID: "3aacda7b-a599-43b2-9c44-920593c90e36"). InnerVolumeSpecName "kube-api-access-jf72p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.313524 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf72p\" (UniqueName: \"kubernetes.io/projected/3aacda7b-a599-43b2-9c44-920593c90e36-kube-api-access-jf72p\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.313561 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.723562 4883 generic.go:334] "Generic (PLEG): container finished" podID="3aacda7b-a599-43b2-9c44-920593c90e36" containerID="90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74" exitCode=0 Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.723625 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvrxx" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.723645 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerDied","Data":"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74"} Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.723992 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvrxx" event={"ID":"3aacda7b-a599-43b2-9c44-920593c90e36","Type":"ContainerDied","Data":"2ea9a53065218527a1d073b08744fda8c491f3eb86123e90ad0000b1e2faa81e"} Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.724011 4883 scope.go:117] "RemoveContainer" containerID="90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.741513 4883 scope.go:117] "RemoveContainer" 
containerID="4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.761176 4883 scope.go:117] "RemoveContainer" containerID="3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.808752 4883 scope.go:117] "RemoveContainer" containerID="90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74" Mar 10 10:02:40 crc kubenswrapper[4883]: E0310 10:02:40.809170 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74\": container with ID starting with 90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74 not found: ID does not exist" containerID="90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.809216 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74"} err="failed to get container status \"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74\": rpc error: code = NotFound desc = could not find container \"90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74\": container with ID starting with 90b65c31b6df7ac300db038977758f233046c240178382f675cb0e9e74271a74 not found: ID does not exist" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.809246 4883 scope.go:117] "RemoveContainer" containerID="4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422" Mar 10 10:02:40 crc kubenswrapper[4883]: E0310 10:02:40.809865 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422\": container with ID starting with 
4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422 not found: ID does not exist" containerID="4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.809899 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422"} err="failed to get container status \"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422\": rpc error: code = NotFound desc = could not find container \"4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422\": container with ID starting with 4f8ae1d9ebb01026c7999f4b86e95909f4ee4e40d925ce536e81f28bfaec4422 not found: ID does not exist" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.809924 4883 scope.go:117] "RemoveContainer" containerID="3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a" Mar 10 10:02:40 crc kubenswrapper[4883]: E0310 10:02:40.810234 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a\": container with ID starting with 3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a not found: ID does not exist" containerID="3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a" Mar 10 10:02:40 crc kubenswrapper[4883]: I0310 10:02:40.810291 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a"} err="failed to get container status \"3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a\": rpc error: code = NotFound desc = could not find container \"3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a\": container with ID starting with 3c4cdcfae23ee05703f64be036d98e1f05c34813efc50a86b23632ccc3a1804a not found: ID does not 
exist" Mar 10 10:02:41 crc kubenswrapper[4883]: I0310 10:02:41.565786 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3aacda7b-a599-43b2-9c44-920593c90e36" (UID: "3aacda7b-a599-43b2-9c44-920593c90e36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:02:41 crc kubenswrapper[4883]: I0310 10:02:41.643327 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aacda7b-a599-43b2-9c44-920593c90e36-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:02:41 crc kubenswrapper[4883]: I0310 10:02:41.654187 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:41 crc kubenswrapper[4883]: I0310 10:02:41.661346 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bvrxx"] Mar 10 10:02:42 crc kubenswrapper[4883]: I0310 10:02:42.090176 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" path="/var/lib/kubelet/pods/3aacda7b-a599-43b2-9c44-920593c90e36/volumes" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.394924 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mhscc"] Mar 10 10:02:59 crc kubenswrapper[4883]: E0310 10:02:59.396700 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="extract-content" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.396729 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="extract-content" Mar 10 10:02:59 crc kubenswrapper[4883]: E0310 10:02:59.396762 4883 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="extract-utilities" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.396768 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="extract-utilities" Mar 10 10:02:59 crc kubenswrapper[4883]: E0310 10:02:59.396783 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="registry-server" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.396788 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="registry-server" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.396976 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aacda7b-a599-43b2-9c44-920593c90e36" containerName="registry-server" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.398280 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.410055 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhscc"] Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.523502 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hngb2\" (UniqueName: \"kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.523581 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities\") pod \"certified-operators-mhscc\" (UID: 
\"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.523678 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.624866 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.625077 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.625219 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hngb2\" (UniqueName: \"kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.625293 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities\") pod \"certified-operators-mhscc\" (UID: 
\"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.625680 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.641795 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hngb2\" (UniqueName: \"kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2\") pod \"certified-operators-mhscc\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:02:59 crc kubenswrapper[4883]: I0310 10:02:59.730694 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:03:00 crc kubenswrapper[4883]: I0310 10:03:00.168118 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mhscc"] Mar 10 10:03:00 crc kubenswrapper[4883]: I0310 10:03:00.922789 4883 generic.go:334] "Generic (PLEG): container finished" podID="7054bc28-c5d1-41b1-a322-ba547740a357" containerID="4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2" exitCode=0 Mar 10 10:03:00 crc kubenswrapper[4883]: I0310 10:03:00.923127 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerDied","Data":"4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2"} Mar 10 10:03:00 crc kubenswrapper[4883]: I0310 10:03:00.923193 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerStarted","Data":"0efd33f1cd5c5db073f800f5a2c225b5dab7634b7633268d907da532c5609709"} Mar 10 10:03:00 crc kubenswrapper[4883]: I0310 10:03:00.926837 4883 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 10:03:01 crc kubenswrapper[4883]: I0310 10:03:01.932423 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerStarted","Data":"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4"} Mar 10 10:03:02 crc kubenswrapper[4883]: I0310 10:03:02.943450 4883 generic.go:334] "Generic (PLEG): container finished" podID="7054bc28-c5d1-41b1-a322-ba547740a357" containerID="05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4" exitCode=0 Mar 10 10:03:02 crc kubenswrapper[4883]: I0310 10:03:02.943668 4883 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerDied","Data":"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4"} Mar 10 10:03:03 crc kubenswrapper[4883]: I0310 10:03:03.969646 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerStarted","Data":"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f"} Mar 10 10:03:03 crc kubenswrapper[4883]: I0310 10:03:03.991635 4883 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mhscc" podStartSLOduration=2.448060207 podStartE2EDuration="4.991618148s" podCreationTimestamp="2026-03-10 10:02:59 +0000 UTC" firstStartedPulling="2026-03-10 10:03:00.926618269 +0000 UTC m=+3567.181516159" lastFinishedPulling="2026-03-10 10:03:03.470176211 +0000 UTC m=+3569.725074100" observedRunningTime="2026-03-10 10:03:03.983886309 +0000 UTC m=+3570.238784199" watchObservedRunningTime="2026-03-10 10:03:03.991618148 +0000 UTC m=+3570.246516037" Mar 10 10:03:04 crc kubenswrapper[4883]: I0310 10:03:04.977973 4883 generic.go:334] "Generic (PLEG): container finished" podID="71aa2959-c8ee-46b8-bd2b-654620fbd99a" containerID="3bd8ffbe632d745f21bae1bcada8aafee95f3789b72071a8dc962e253bd69366" exitCode=0 Mar 10 10:03:04 crc kubenswrapper[4883]: I0310 10:03:04.978052 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" event={"ID":"71aa2959-c8ee-46b8-bd2b-654620fbd99a","Type":"ContainerDied","Data":"3bd8ffbe632d745f21bae1bcada8aafee95f3789b72071a8dc962e253bd69366"} Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.073664 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.108057 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-9f7pq"] Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.113520 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-9f7pq"] Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.162466 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host\") pod \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.162533 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host" (OuterVolumeSpecName: "host") pod "71aa2959-c8ee-46b8-bd2b-654620fbd99a" (UID: "71aa2959-c8ee-46b8-bd2b-654620fbd99a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.162779 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62xkz\" (UniqueName: \"kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz\") pod \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\" (UID: \"71aa2959-c8ee-46b8-bd2b-654620fbd99a\") " Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.163199 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71aa2959-c8ee-46b8-bd2b-654620fbd99a-host\") on node \"crc\" DevicePath \"\"" Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.167516 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz" (OuterVolumeSpecName: "kube-api-access-62xkz") pod "71aa2959-c8ee-46b8-bd2b-654620fbd99a" (UID: "71aa2959-c8ee-46b8-bd2b-654620fbd99a"). InnerVolumeSpecName "kube-api-access-62xkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.265860 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62xkz\" (UniqueName: \"kubernetes.io/projected/71aa2959-c8ee-46b8-bd2b-654620fbd99a-kube-api-access-62xkz\") on node \"crc\" DevicePath \"\"" Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.995648 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb0e60bc99c835c81493440e0b3625b06eace07b336b1d7ae7500b2ea5926dfa" Mar 10 10:03:06 crc kubenswrapper[4883]: I0310 10:03:06.995732 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-9f7pq" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.304001 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-wsv4n"] Mar 10 10:03:07 crc kubenswrapper[4883]: E0310 10:03:07.304732 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71aa2959-c8ee-46b8-bd2b-654620fbd99a" containerName="container-00" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.304750 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="71aa2959-c8ee-46b8-bd2b-654620fbd99a" containerName="container-00" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.304933 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="71aa2959-c8ee-46b8-bd2b-654620fbd99a" containerName="container-00" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.305597 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.307202 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8cw7n"/"default-dockercfg-68b9q" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.389090 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.389259 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnm4n\" (UniqueName: \"kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " 
pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.490356 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnm4n\" (UniqueName: \"kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.490443 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.490631 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.509379 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnm4n\" (UniqueName: \"kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n\") pod \"crc-debug-wsv4n\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" Mar 10 10:03:07 crc kubenswrapper[4883]: I0310 10:03:07.661224 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" Mar 10 10:03:07 crc kubenswrapper[4883]: W0310 10:03:07.693618 4883 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf05e8636_7b95_4487_bd28_96cb3159b18e.slice/crio-da59adcc4e4d5489a29e754cdb9e1c48752eb1c5bb76d4b10477711918db40ce WatchSource:0}: Error finding container da59adcc4e4d5489a29e754cdb9e1c48752eb1c5bb76d4b10477711918db40ce: Status 404 returned error can't find the container with id da59adcc4e4d5489a29e754cdb9e1c48752eb1c5bb76d4b10477711918db40ce Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.016241 4883 generic.go:334] "Generic (PLEG): container finished" podID="f05e8636-7b95-4487-bd28-96cb3159b18e" containerID="5f767fd40e7a1f328bfb19f2b7318e01d59fdbb800fcd79f270e6c2ebac7a271" exitCode=0 Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.016467 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" event={"ID":"f05e8636-7b95-4487-bd28-96cb3159b18e","Type":"ContainerDied","Data":"5f767fd40e7a1f328bfb19f2b7318e01d59fdbb800fcd79f270e6c2ebac7a271"} Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.016506 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" event={"ID":"f05e8636-7b95-4487-bd28-96cb3159b18e","Type":"ContainerStarted","Data":"da59adcc4e4d5489a29e754cdb9e1c48752eb1c5bb76d4b10477711918db40ce"} Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.088840 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71aa2959-c8ee-46b8-bd2b-654620fbd99a" path="/var/lib/kubelet/pods/71aa2959-c8ee-46b8-bd2b-654620fbd99a/volumes" Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.445046 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-wsv4n"] Mar 10 10:03:08 crc kubenswrapper[4883]: I0310 10:03:08.452607 4883 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-wsv4n"] Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.107937 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.228148 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host\") pod \"f05e8636-7b95-4487-bd28-96cb3159b18e\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.228456 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnm4n\" (UniqueName: \"kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n\") pod \"f05e8636-7b95-4487-bd28-96cb3159b18e\" (UID: \"f05e8636-7b95-4487-bd28-96cb3159b18e\") " Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.228276 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host" (OuterVolumeSpecName: "host") pod "f05e8636-7b95-4487-bd28-96cb3159b18e" (UID: "f05e8636-7b95-4487-bd28-96cb3159b18e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.229567 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f05e8636-7b95-4487-bd28-96cb3159b18e-host\") on node \"crc\" DevicePath \"\"" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.234026 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n" (OuterVolumeSpecName: "kube-api-access-cnm4n") pod "f05e8636-7b95-4487-bd28-96cb3159b18e" (UID: "f05e8636-7b95-4487-bd28-96cb3159b18e"). InnerVolumeSpecName "kube-api-access-cnm4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.332417 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnm4n\" (UniqueName: \"kubernetes.io/projected/f05e8636-7b95-4487-bd28-96cb3159b18e-kube-api-access-cnm4n\") on node \"crc\" DevicePath \"\"" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.616992 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-q6vmd"] Mar 10 10:03:09 crc kubenswrapper[4883]: E0310 10:03:09.617654 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05e8636-7b95-4487-bd28-96cb3159b18e" containerName="container-00" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.617670 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05e8636-7b95-4487-bd28-96cb3159b18e" containerName="container-00" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.617859 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05e8636-7b95-4487-bd28-96cb3159b18e" containerName="container-00" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.618454 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.731639 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.731698 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.741385 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49rqf\" (UniqueName: \"kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.741918 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.768060 4883 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.843516 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.843657 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.843749 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49rqf\" (UniqueName: \"kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.859646 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49rqf\" (UniqueName: \"kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf\") pod \"crc-debug-q6vmd\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" Mar 10 10:03:09 crc kubenswrapper[4883]: I0310 10:03:09.931770 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.034170 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da59adcc4e4d5489a29e754cdb9e1c48752eb1c5bb76d4b10477711918db40ce" Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.034183 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-wsv4n" Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.035651 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" event={"ID":"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5","Type":"ContainerStarted","Data":"d795a2b4d8eabba9d1a1bb4399248590bb62fc636940c89f84c73983260a5143"} Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.088814 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05e8636-7b95-4487-bd28-96cb3159b18e" path="/var/lib/kubelet/pods/f05e8636-7b95-4487-bd28-96cb3159b18e/volumes" Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.089550 4883 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:03:10 crc kubenswrapper[4883]: I0310 10:03:10.152754 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhscc"] Mar 10 10:03:11 crc kubenswrapper[4883]: I0310 10:03:11.044056 4883 generic.go:334] "Generic (PLEG): container finished" podID="5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" containerID="0b5b985860f6b61a235555a89c89e510ababeb5a50b400ad61a81c43c2bb3059" exitCode=0 Mar 10 10:03:11 crc kubenswrapper[4883]: I0310 10:03:11.044112 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" event={"ID":"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5","Type":"ContainerDied","Data":"0b5b985860f6b61a235555a89c89e510ababeb5a50b400ad61a81c43c2bb3059"} Mar 10 10:03:11 crc kubenswrapper[4883]: I0310 10:03:11.076205 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-q6vmd"] Mar 10 10:03:11 crc kubenswrapper[4883]: I0310 10:03:11.091894 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8cw7n/crc-debug-q6vmd"] Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.050586 4883 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mhscc" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="registry-server" containerID="cri-o://4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f" gracePeriod=2 Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.231589 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.293040 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49rqf\" (UniqueName: \"kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf\") pod \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.293182 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host\") pod \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\" (UID: \"5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5\") " Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.293306 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host" (OuterVolumeSpecName: "host") pod "5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" (UID: "5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.293772 4883 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-host\") on node \"crc\" DevicePath \"\"" Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.299577 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf" (OuterVolumeSpecName: "kube-api-access-49rqf") pod "5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" (UID: "5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5"). InnerVolumeSpecName "kube-api-access-49rqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.392154 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.396062 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49rqf\" (UniqueName: \"kubernetes.io/projected/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5-kube-api-access-49rqf\") on node \"crc\" DevicePath \"\"" Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.497959 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content\") pod \"7054bc28-c5d1-41b1-a322-ba547740a357\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.498360 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities\") pod \"7054bc28-c5d1-41b1-a322-ba547740a357\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 
10:03:12.498598 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hngb2\" (UniqueName: \"kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2\") pod \"7054bc28-c5d1-41b1-a322-ba547740a357\" (UID: \"7054bc28-c5d1-41b1-a322-ba547740a357\") " Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.499630 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities" (OuterVolumeSpecName: "utilities") pod "7054bc28-c5d1-41b1-a322-ba547740a357" (UID: "7054bc28-c5d1-41b1-a322-ba547740a357"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.502081 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2" (OuterVolumeSpecName: "kube-api-access-hngb2") pod "7054bc28-c5d1-41b1-a322-ba547740a357" (UID: "7054bc28-c5d1-41b1-a322-ba547740a357"). InnerVolumeSpecName "kube-api-access-hngb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.602403 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hngb2\" (UniqueName: \"kubernetes.io/projected/7054bc28-c5d1-41b1-a322-ba547740a357-kube-api-access-hngb2\") on node \"crc\" DevicePath \"\"" Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.602813 4883 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.789335 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7054bc28-c5d1-41b1-a322-ba547740a357" (UID: "7054bc28-c5d1-41b1-a322-ba547740a357"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:03:12 crc kubenswrapper[4883]: I0310 10:03:12.806731 4883 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7054bc28-c5d1-41b1-a322-ba547740a357-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.072270 4883 generic.go:334] "Generic (PLEG): container finished" podID="7054bc28-c5d1-41b1-a322-ba547740a357" containerID="4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f" exitCode=0 Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.072348 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerDied","Data":"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f"} Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.072407 4883 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-mhscc" event={"ID":"7054bc28-c5d1-41b1-a322-ba547740a357","Type":"ContainerDied","Data":"0efd33f1cd5c5db073f800f5a2c225b5dab7634b7633268d907da532c5609709"} Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.072440 4883 scope.go:117] "RemoveContainer" containerID="4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.074225 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mhscc" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.078372 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/crc-debug-q6vmd" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.101086 4883 scope.go:117] "RemoveContainer" containerID="05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.110780 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mhscc"] Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.119585 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mhscc"] Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.119939 4883 scope.go:117] "RemoveContainer" containerID="4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.165143 4883 scope.go:117] "RemoveContainer" containerID="4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f" Mar 10 10:03:13 crc kubenswrapper[4883]: E0310 10:03:13.165709 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f\": container with ID starting with 
4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f not found: ID does not exist" containerID="4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.165773 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f"} err="failed to get container status \"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f\": rpc error: code = NotFound desc = could not find container \"4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f\": container with ID starting with 4acfa633a588fbb220b595825947d2139a1445e840a036f32d252450aef7f01f not found: ID does not exist" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.165804 4883 scope.go:117] "RemoveContainer" containerID="05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4" Mar 10 10:03:13 crc kubenswrapper[4883]: E0310 10:03:13.166165 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4\": container with ID starting with 05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4 not found: ID does not exist" containerID="05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.166207 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4"} err="failed to get container status \"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4\": rpc error: code = NotFound desc = could not find container \"05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4\": container with ID starting with 05d611975db86aef413004bd486d1fb85320f7a06c525152681c0ca13f9118f4 not found: ID does not 
exist" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.166232 4883 scope.go:117] "RemoveContainer" containerID="4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2" Mar 10 10:03:13 crc kubenswrapper[4883]: E0310 10:03:13.168797 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2\": container with ID starting with 4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2 not found: ID does not exist" containerID="4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.168844 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2"} err="failed to get container status \"4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2\": rpc error: code = NotFound desc = could not find container \"4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2\": container with ID starting with 4885303c58bafbac6426fd1e327eda71a40e269cdc602078d15c16856513cfa2 not found: ID does not exist" Mar 10 10:03:13 crc kubenswrapper[4883]: I0310 10:03:13.168876 4883 scope.go:117] "RemoveContainer" containerID="0b5b985860f6b61a235555a89c89e510ababeb5a50b400ad61a81c43c2bb3059" Mar 10 10:03:14 crc kubenswrapper[4883]: I0310 10:03:14.094567 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" path="/var/lib/kubelet/pods/5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5/volumes" Mar 10 10:03:14 crc kubenswrapper[4883]: I0310 10:03:14.095535 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" path="/var/lib/kubelet/pods/7054bc28-c5d1-41b1-a322-ba547740a357/volumes" Mar 10 10:03:17 crc kubenswrapper[4883]: I0310 10:03:17.448973 4883 
patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:03:17 crc kubenswrapper[4883]: I0310 10:03:17.449361 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.518229 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b55764b68-l794s_17a46674-c6ec-4128-8285-d71c228d11c8/barbican-api/0.log" Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.677579 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b55764b68-l794s_17a46674-c6ec-4128-8285-d71c228d11c8/barbican-api-log/0.log" Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.692325 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f87cd7fb6-jz6ch_dce7df3b-5f31-4732-8d27-8e06dc07824d/barbican-keystone-listener/0.log" Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.704195 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f87cd7fb6-jz6ch_dce7df3b-5f31-4732-8d27-8e06dc07824d/barbican-keystone-listener-log/0.log" Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.868930 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d48955d69-pvbn8_221490bc-406a-436f-8705-66106ed6bbe0/barbican-worker/0.log" Mar 10 10:03:38 crc kubenswrapper[4883]: I0310 10:03:38.880573 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-5d48955d69-pvbn8_221490bc-406a-436f-8705-66106ed6bbe0/barbican-worker-log/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.163023 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-68q9r_de8c98db-31db-4ecd-83f2-c53d4bdd2ddd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.205766 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/ceilometer-central-agent/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.276600 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/ceilometer-notification-agent/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.329429 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/proxy-httpd/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.383623 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0819f125-35db-4a0e-8fff-c1d3d3a27ae7/sg-core/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.499077 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2caba5f6-d05e-437e-868c-952e8adf3278/cinder-api/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.540004 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2caba5f6-d05e-437e-868c-952e8adf3278/cinder-api-log/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.694198 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a7bae0a1-9bb8-47ba-a161-764cd7406992/probe/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.721231 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_a7bae0a1-9bb8-47ba-a161-764cd7406992/cinder-scheduler/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.818633 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-m7ffm_07ddb6af-f2c7-46eb-aac4-fe69996caf27/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:39 crc kubenswrapper[4883]: I0310 10:03:39.927636 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vtbkh_269dd9c8-3d75-4892-9f75-c4fe1b9093b8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.011962 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/init/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.175105 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/init/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.213972 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-rx8gc_da34e0af-a084-40fb-93ea-471923c49051/dnsmasq-dns/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.228753 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5qjpr_2428d4e5-b48e-45ad-9bfb-711c3b1e8471/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.393697 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4c354fe8-851f-4cf4-bc13-e06dba0a1cc0/glance-httpd/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.415527 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_4c354fe8-851f-4cf4-bc13-e06dba0a1cc0/glance-log/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.572247 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_676458e7-e4a0-4f1a-b200-0ab75faaddb4/glance-httpd/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.579336 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_676458e7-e4a0-4f1a-b200-0ab75faaddb4/glance-log/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.692022 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c96ddb-t7tcm_ef0598ad-c7ea-4645-b553-7d9028397156/horizon/0.log" Mar 10 10:03:40 crc kubenswrapper[4883]: I0310 10:03:40.826168 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-8b2t9_7e9f7531-37e1-4284-94ac-cada3d2fc301/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.031233 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-555c96ddb-t7tcm_ef0598ad-c7ea-4645-b553-7d9028397156/horizon-log/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.037484 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kglh5_361b2613-f26e-45c3-aabe-9a0f115e8e10/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.234682 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-744f4576f6-kglt9_c6effa97-6f88-4706-98bc-b51af01bd993/keystone-api/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.481796 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29552281-kbhqs_845ef92d-2dae-49c8-823f-9e3fe2735d79/keystone-cron/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.532394 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_39c373dd-952a-4305-82ed-1d047c7a859f/kube-state-metrics/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.640219 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-thhsw_eb3b72a2-945a-4719-87c0-ffaf7eb84b52/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:41 crc kubenswrapper[4883]: I0310 10:03:41.990721 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7b5fb6fc5c-pj985_82fb8a17-1c35-415a-8a5d-478730286eb1/neutron-httpd/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.012816 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7b5fb6fc5c-pj985_82fb8a17-1c35-415a-8a5d-478730286eb1/neutron-api/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.091029 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd9m4_d37d0afe-ad64-4616-b877-bd05deefd038/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.612380 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_14ae000e-33d5-4caa-8b61-dd1ab03b9978/nova-api-log/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.680002 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_19096ebe-3796-4e22-a477-45d3e635a80a/nova-cell0-conductor-conductor/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.842613 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_90b06d82-9f07-4c29-9bad-987d2c6d027c/nova-cell1-conductor-conductor/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.941564 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_14ae000e-33d5-4caa-8b61-dd1ab03b9978/nova-api-api/0.log" Mar 10 10:03:42 crc kubenswrapper[4883]: I0310 10:03:42.986938 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2c5d710c-62fb-4a8c-8a5c-ec6709017c75/nova-cell1-novncproxy-novncproxy/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.118452 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-47dxf_af134b73-8c24-4b9e-b15e-48ff4b83ecd4/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.310789 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0743bd84-b1d5-4634-9a7f-2c9daf2a5994/nova-metadata-log/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.623896 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/mysql-bootstrap/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.629146 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_626b3115-ced1-45ea-8401-e2bd7e79a20c/nova-scheduler-scheduler/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.805751 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/mysql-bootstrap/0.log" Mar 10 10:03:43 crc kubenswrapper[4883]: I0310 10:03:43.826129 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_287f174d-514a-4c8c-a70e-b6e64fe41653/galera/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.004336 
4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/mysql-bootstrap/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.260282 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/mysql-bootstrap/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.290906 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_5dae6834-0ed6-4043-9efe-91745925591a/galera/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.485251 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0743bd84-b1d5-4634-9a7f-2c9daf2a5994/nova-metadata-metadata/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.584916 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_166b0c95-d44f-41e4-b27a-01e549dfb9d2/openstackclient/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.630626 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lb2z9_6691939e-adb0-420c-bf9e-f4a9b670c83b/ovn-controller/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.793729 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server-init/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.840600 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-b2z2p_570aed6d-03dc-4ad5-b0e1-c6efc4facabb/openstack-network-exporter/0.log" Mar 10 10:03:44 crc kubenswrapper[4883]: I0310 10:03:44.982559 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovs-vswitchd/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.052024 4883 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.077079 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qrl4s_28145780-82a1-453f-be56-b22c635f027e/ovsdb-server-init/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.178003 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7cqkz_bbcde384-73a5-48c3-a5fb-226d671707cb/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.265028 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b47099e9-f945-4873-a704-ee55b0f0ac46/ovn-northd/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.302913 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b47099e9-f945-4873-a704-ee55b0f0ac46/openstack-network-exporter/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.474877 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_be383ddb-b33d-4129-acf8-1ffbbc21b1d4/openstack-network-exporter/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.498432 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_be383ddb-b33d-4129-acf8-1ffbbc21b1d4/ovsdbserver-nb/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.622661 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_249a9bf5-ef0f-4209-855e-3fa422106519/openstack-network-exporter/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.647711 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_249a9bf5-ef0f-4209-855e-3fa422106519/ovsdbserver-sb/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.861849 4883 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5946656968-5mzlm_309c3af5-db30-48b8-8118-471950b7312c/placement-log/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.889904 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5946656968-5mzlm_309c3af5-db30-48b8-8118-471950b7312c/placement-api/0.log" Mar 10 10:03:45 crc kubenswrapper[4883]: I0310 10:03:45.906933 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/setup-container/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.115644 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/setup-container/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.190080 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_170c41ad-d10f-4567-97ec-2b90d149951b/rabbitmq/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.196745 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/setup-container/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.381072 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/rabbitmq/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.398834 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1be05788-71cf-486a-8142-e317e959bfe9/setup-container/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.488509 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b8pxz_0efdf39d-2133-4aaf-9fec-2b50533d3cae/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 
10:03:46.686903 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pf4n9_d3461a81-abbe-4c3e-88ca-42eff1eeb14e/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.715094 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jj4s7_4f4d24a3-c6d6-4707-a8cc-22ef696b5aa9/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.877709 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rlqjc_61bb4cc5-1d4f-4439-a00e-4b2e27d4802b/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:46 crc kubenswrapper[4883]: I0310 10:03:46.925684 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-v5v84_caa69332-97ab-4629-900f-1596af363ba4/ssh-known-hosts-edpm-deployment/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.169802 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6656f7cc-nv5pp_ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b/proxy-server/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.182108 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-n4vhh_cbe93226-96c7-4854-abdc-4afe54ad7ad5/swift-ring-rebalance/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.225064 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6656f7cc-nv5pp_ba990cdc-e8f5-4875-a8f8-bb8f9829ba3b/proxy-httpd/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.448585 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.448653 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.550124 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-auditor/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.577520 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-reaper/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.650186 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-replicator/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.767231 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/account-server/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.769634 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-auditor/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.795720 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-replicator/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.871807 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-server/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 
10:03:47.942523 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-auditor/0.log" Mar 10 10:03:47 crc kubenswrapper[4883]: I0310 10:03:47.959525 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/container-updater/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.010245 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-expirer/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.097048 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-replicator/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.142224 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-updater/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.149458 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/object-server/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.267612 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/rsync/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.300309 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39fdf41f-a914-4d0f-8d0c-5e378567a2db/swift-recon-cron/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.386619 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-blk56_b083d3b3-edb7-4d2f-a7b7-f1275bd83fde/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.525007 4883 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d483d791-15b3-49e7-8095-5660a9d0fdaa/tempest-tests-tempest-tests-runner/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.589712 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4d76dec9-afd2-4850-aacb-c8d60819fc1e/test-operator-logs-container/0.log" Mar 10 10:03:48 crc kubenswrapper[4883]: I0310 10:03:48.756604 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zd7kp_20e06399-dd26-4a60-a6b7-261cc4505a92/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 10:03:59 crc kubenswrapper[4883]: I0310 10:03:59.361523 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_52bdcacc-ce19-418b-871c-35482038da29/memcached/0.log" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.138612 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552284-6w2t6"] Mar 10 10:04:00 crc kubenswrapper[4883]: E0310 10:04:00.139209 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" containerName="container-00" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139228 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" containerName="container-00" Mar 10 10:04:00 crc kubenswrapper[4883]: E0310 10:04:00.139243 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="registry-server" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139249 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="registry-server" Mar 10 10:04:00 crc kubenswrapper[4883]: E0310 10:04:00.139259 4883 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="extract-utilities" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139266 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="extract-utilities" Mar 10 10:04:00 crc kubenswrapper[4883]: E0310 10:04:00.139288 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="extract-content" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139294 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="extract-content" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139529 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="7054bc28-c5d1-41b1-a322-ba547740a357" containerName="registry-server" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.139564 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b2a67cf-532a-4bf1-81d5-7e3bf4d16ce5" containerName="container-00" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.140171 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.141933 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.142265 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.142799 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.147286 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552284-6w2t6"] Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.287004 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-465nq\" (UniqueName: \"kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq\") pod \"auto-csr-approver-29552284-6w2t6\" (UID: \"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e\") " pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.389695 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-465nq\" (UniqueName: \"kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq\") pod \"auto-csr-approver-29552284-6w2t6\" (UID: \"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e\") " pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.406273 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-465nq\" (UniqueName: \"kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq\") pod \"auto-csr-approver-29552284-6w2t6\" (UID: \"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e\") " 
pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.457131 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:00 crc kubenswrapper[4883]: I0310 10:04:00.864719 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552284-6w2t6"] Mar 10 10:04:01 crc kubenswrapper[4883]: I0310 10:04:01.541599 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" event={"ID":"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e","Type":"ContainerStarted","Data":"1a8589d54c69ed12440517871fa245b292af6750147579b2c23f96816d3a02e5"} Mar 10 10:04:02 crc kubenswrapper[4883]: I0310 10:04:02.552573 4883 generic.go:334] "Generic (PLEG): container finished" podID="00b10c19-3e2e-4f4a-812f-bdfaa0415a7e" containerID="267d7220e198001e5b7f146d949413df18a5c26f7173f9289b7707d0d7351557" exitCode=0 Mar 10 10:04:02 crc kubenswrapper[4883]: I0310 10:04:02.552665 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" event={"ID":"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e","Type":"ContainerDied","Data":"267d7220e198001e5b7f146d949413df18a5c26f7173f9289b7707d0d7351557"} Mar 10 10:04:03 crc kubenswrapper[4883]: I0310 10:04:03.838082 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:03 crc kubenswrapper[4883]: I0310 10:04:03.953928 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-465nq\" (UniqueName: \"kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq\") pod \"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e\" (UID: \"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e\") " Mar 10 10:04:03 crc kubenswrapper[4883]: I0310 10:04:03.961263 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq" (OuterVolumeSpecName: "kube-api-access-465nq") pod "00b10c19-3e2e-4f4a-812f-bdfaa0415a7e" (UID: "00b10c19-3e2e-4f4a-812f-bdfaa0415a7e"). InnerVolumeSpecName "kube-api-access-465nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.058390 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-465nq\" (UniqueName: \"kubernetes.io/projected/00b10c19-3e2e-4f4a-812f-bdfaa0415a7e-kube-api-access-465nq\") on node \"crc\" DevicePath \"\"" Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.571103 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" event={"ID":"00b10c19-3e2e-4f4a-812f-bdfaa0415a7e","Type":"ContainerDied","Data":"1a8589d54c69ed12440517871fa245b292af6750147579b2c23f96816d3a02e5"} Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.571164 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a8589d54c69ed12440517871fa245b292af6750147579b2c23f96816d3a02e5" Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.571169 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552284-6w2t6" Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.909109 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-mmnzb"] Mar 10 10:04:04 crc kubenswrapper[4883]: I0310 10:04:04.918197 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552278-mmnzb"] Mar 10 10:04:06 crc kubenswrapper[4883]: I0310 10:04:06.091612 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63837d10-3c84-4972-98da-7415e14f2594" path="/var/lib/kubelet/pods/63837d10-3c84-4972-98da-7415e14f2594/volumes" Mar 10 10:04:07 crc kubenswrapper[4883]: I0310 10:04:07.282103 4883 scope.go:117] "RemoveContainer" containerID="d248878325804477b2b46157dab9cab4990cb786e7cd390c24f00599d57f6825" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.193919 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.358798 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.366828 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.384971 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.517963 4883 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/util/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.529709 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/pull/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.552984 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_951b5e6f67e727348308cf03f2b463d3e2bd27386b453f6699e03a49babrb5g_9b8a84a3-2cd3-452c-9e28-5bfa45be11c1/extract/0.log" Mar 10 10:04:11 crc kubenswrapper[4883]: I0310 10:04:11.968799 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-h2cxw_9a394c48-31ca-4e99-b210-45ae6f67faaa/manager/0.log" Mar 10 10:04:12 crc kubenswrapper[4883]: I0310 10:04:12.267828 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-w9dbp_63474f68-d09d-4822-b650-96a37aead592/manager/0.log" Mar 10 10:04:12 crc kubenswrapper[4883]: I0310 10:04:12.376678 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-mbxnn_bf027c79-6bdb-4cfb-8c31-d785b80e2231/manager/0.log" Mar 10 10:04:12 crc kubenswrapper[4883]: I0310 10:04:12.559092 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-fvwbt_8a4cb5eb-0894-440e-8cfd-448651696a6f/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.020389 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-txdwh_884f7bcb-08ef-49f3-912b-ca921e342615/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 
10:04:13.082059 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-nzdsk_09a04267-a914-4c55-add8-735a053038d3/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.166532 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-v6p2d_c994e4ad-140c-4655-ad69-e4013406d12e/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.326726 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-v5kxw_ad93994a-26d2-4353-80be-456c1311020e/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.413279 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-dgrlb_8b177c77-d85f-4374-b6db-a700719c1282/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.735189 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-kz9sv_ec624ec4-966f-410c-95c7-73be0f9cad27/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.784358 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-snvh5_91415f40-08a2-451b-abe8-38c7b447e66f/manager/0.log" Mar 10 10:04:13 crc kubenswrapper[4883]: I0310 10:04:13.992468 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-rpwdx_760c8dff-c64a-492b-a778-45ef16d197bd/manager/0.log" Mar 10 10:04:14 crc kubenswrapper[4883]: I0310 10:04:14.056757 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-49gjk_d0e08342-2d1b-42d9-921e-1d948f701a58/manager/0.log" Mar 10 10:04:14 crc 
kubenswrapper[4883]: I0310 10:04:14.218318 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6647d7885f9f2px_2a2580ec-7e99-4eb0-95e2-9e6ca33a6a5f/manager/0.log" Mar 10 10:04:14 crc kubenswrapper[4883]: I0310 10:04:14.519696 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6cf8df7788-tzrb8_31e7ec33-4b44-48ce-9f01-e483a7668dd6/operator/0.log" Mar 10 10:04:14 crc kubenswrapper[4883]: I0310 10:04:14.765242 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-c4vjl_83852eec-509b-4074-b837-4f00d1d07d05/registry-server/0.log" Mar 10 10:04:14 crc kubenswrapper[4883]: I0310 10:04:14.866562 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-qnwgj_c13f33e2-dd6a-4ca0-91e7-5489c753e273/manager/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.047869 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-pppd9_04b3aecb-7cfd-4042-b003-4bc8c339aff8/manager/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.186879 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pjjsn_475c1190-6d94-431a-943d-4e749ea87d6b/operator/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.302033 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-m6wph_1b429bd6-00de-4cc2-8a18-9f58897b6834/manager/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.521002 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-mkjnt_3f4c2998-b51a-4620-b674-60bb0817eb7d/manager/0.log" Mar 10 
10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.592573 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-8mpp4_d3d3c04d-7e05-4df2-85c6-394d0bde1a69/manager/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.727943 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-rkjsw_a7216675-a296-4faa-9dd5-d857b15ffa3c/manager/0.log" Mar 10 10:04:15 crc kubenswrapper[4883]: I0310 10:04:15.838630 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6679ddfdc7-9ntl4_969b2d39-fb99-42df-8e6e-3ded5cd292c8/manager/0.log" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.448715 4883 patch_prober.go:28] interesting pod/machine-config-daemon-zxzn8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.449189 4883 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.449267 4883 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.450686 4883 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"} 
pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.450769 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" containerName="machine-config-daemon" containerID="cri-o://05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" gracePeriod=600 Mar 10 10:04:17 crc kubenswrapper[4883]: E0310 10:04:17.593064 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.685239 4883 generic.go:334] "Generic (PLEG): container finished" podID="99873383-15b6-42ee-a65f-7917294d2e02" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" exitCode=0 Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.685288 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerDied","Data":"05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"} Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.685450 4883 scope.go:117] "RemoveContainer" containerID="02853109fcbc3c4b53e8c3e9045a6ad761bbdd1ba2cc93d6639942e6c10801e3" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.686085 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 
10 10:04:17 crc kubenswrapper[4883]: E0310 10:04:17.687888 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:04:17 crc kubenswrapper[4883]: I0310 10:04:17.704623 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-q52nj_ac18771f-5f45-40d8-b275-38e2e1c48ba6/manager/0.log" Mar 10 10:04:28 crc kubenswrapper[4883]: I0310 10:04:28.080304 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:04:28 crc kubenswrapper[4883]: E0310 10:04:28.081410 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:04:32 crc kubenswrapper[4883]: I0310 10:04:32.104413 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dlh8b_7ec510e9-f96b-44da-abec-7d49115d0c83/control-plane-machine-set-operator/0.log" Mar 10 10:04:32 crc kubenswrapper[4883]: I0310 10:04:32.248813 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7clc9_3de74a75-4aa1-46dd-ae5b-5c82b91811e5/kube-rbac-proxy/0.log" Mar 10 10:04:32 crc kubenswrapper[4883]: I0310 
10:04:32.260349 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7clc9_3de74a75-4aa1-46dd-ae5b-5c82b91811e5/machine-api-operator/0.log" Mar 10 10:04:42 crc kubenswrapper[4883]: I0310 10:04:42.676668 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-kl2rd_1c0c9250-e9df-4898-bd0e-91919353a3f6/cert-manager-controller/0.log" Mar 10 10:04:42 crc kubenswrapper[4883]: I0310 10:04:42.811740 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-n2g9x_b92cb5d0-214a-49a6-b9b7-f210fef36956/cert-manager-cainjector/0.log" Mar 10 10:04:42 crc kubenswrapper[4883]: I0310 10:04:42.850598 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-dfhh4_f33cf1b9-ce0d-41f4-8f36-1b159badc41e/cert-manager-webhook/0.log" Mar 10 10:04:43 crc kubenswrapper[4883]: I0310 10:04:43.079981 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:04:43 crc kubenswrapper[4883]: E0310 10:04:43.080229 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.416243 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-mr8tf_805fc4e3-bab7-415e-a190-0ceeda5bd8b7/nmstate-console-plugin/0.log" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.564106 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-5lcxd_d9c7e9ee-a0a0-4afe-bd00-872553ca9b32/nmstate-handler/0.log" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.611594 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-x5lcq_291985dd-d623-46ba-9e1b-056dc17d26ed/kube-rbac-proxy/0.log" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.664916 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-x5lcq_291985dd-d623-46ba-9e1b-056dc17d26ed/nmstate-metrics/0.log" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.753520 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-k4v4s_a776287a-5b99-4f43-8d4c-191108392859/nmstate-operator/0.log" Mar 10 10:04:53 crc kubenswrapper[4883]: I0310 10:04:53.824588 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-ccbds_10ab1e00-47a1-4f9a-a55a-131935759d8d/nmstate-webhook/0.log" Mar 10 10:04:54 crc kubenswrapper[4883]: I0310 10:04:54.085143 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:04:54 crc kubenswrapper[4883]: E0310 10:04:54.085438 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:05:08 crc kubenswrapper[4883]: I0310 10:05:08.079703 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:05:08 crc kubenswrapper[4883]: E0310 10:05:08.080390 4883 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:05:15 crc kubenswrapper[4883]: I0310 10:05:15.683065 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-rtrbh_59437559-8b42-4779-8b72-17f09b50b572/kube-rbac-proxy/0.log" Mar 10 10:05:15 crc kubenswrapper[4883]: I0310 10:05:15.747676 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-rtrbh_59437559-8b42-4779-8b72-17f09b50b572/controller/0.log" Mar 10 10:05:15 crc kubenswrapper[4883]: I0310 10:05:15.896944 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.080699 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.083457 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.107998 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.131762 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 
10:05:16.255275 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.272201 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.276501 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.290818 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.438667 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-reloader/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.445663 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-frr-files/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.449674 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/cp-metrics/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.464703 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/controller/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.602856 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/frr-metrics/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.606943 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/kube-rbac-proxy/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.623010 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/kube-rbac-proxy-frr/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.800521 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/reloader/0.log" Mar 10 10:05:16 crc kubenswrapper[4883]: I0310 10:05:16.836677 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-shjnr_8e843a56-715a-44fc-9974-8570d49bd9a0/frr-k8s-webhook-server/0.log" Mar 10 10:05:17 crc kubenswrapper[4883]: I0310 10:05:17.006698 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c79cc77cd-s6vgn_5804aa0d-ee19-4fb3-bd39-27c7103571d8/manager/0.log" Mar 10 10:05:17 crc kubenswrapper[4883]: I0310 10:05:17.202886 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57848ff665-prp4d_cb05036e-52f2-48ab-ba84-f89c4565a0af/webhook-server/0.log" Mar 10 10:05:17 crc kubenswrapper[4883]: I0310 10:05:17.320965 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gtqfn_6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf/kube-rbac-proxy/0.log" Mar 10 10:05:17 crc kubenswrapper[4883]: I0310 10:05:17.746296 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gtqfn_6cecc1fd-5f20-4aff-ae03-570ef8b7dfaf/speaker/0.log" Mar 10 10:05:18 crc kubenswrapper[4883]: I0310 10:05:18.017082 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ck8gb_d2caf019-bd64-4a5c-bf88-c260178bdc82/frr/0.log" Mar 10 10:05:23 crc kubenswrapper[4883]: I0310 10:05:23.080545 4883 scope.go:117] 
"RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:05:23 crc kubenswrapper[4883]: E0310 10:05:23.081170 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.584172 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.771304 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.787082 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.812374 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.964673 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/pull/0.log" Mar 10 10:05:28 crc kubenswrapper[4883]: I0310 10:05:28.993055 4883 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/extract/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.003337 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82p2cjd_eb61f8a4-ceed-4f2a-91fe-ead52fb416ee/util/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.132754 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.297574 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.303810 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.313448 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.463756 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-utilities/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.477128 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/extract-content/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.789574 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-p7kbr_f43173ae-a262-4efa-8141-419be6d01b7d/registry-server/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.865386 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.957186 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.986100 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 10:05:29 crc kubenswrapper[4883]: I0310 10:05:29.988232 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.169890 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-utilities/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.179749 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/extract-content/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.384847 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.539136 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.563121 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.581281 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.735637 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gqg54_8e7df241-6476-44a7-a800-921897b7e381/registry-server/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.746200 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/pull/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.783254 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/extract/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.801101 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f42pdj5_8aebf63f-b8d3-496c-a660-c484d574fb63/util/0.log" Mar 10 10:05:30 crc kubenswrapper[4883]: I0310 10:05:30.999079 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9d6jf_849aec1a-3ce6-4153-8e52-4bf0185e29e3/marketplace-operator/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: 
I0310 10:05:31.007777 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.140347 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.158604 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.167585 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.295774 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-content/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.315317 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.424876 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-v6jt6_790ba2f9-1214-4040-a140-0663e2b869b1/registry-server/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.493596 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.642645 4883 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.655716 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.656139 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.851943 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-utilities/0.log" Mar 10 10:05:31 crc kubenswrapper[4883]: I0310 10:05:31.875738 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/extract-content/0.log" Mar 10 10:05:32 crc kubenswrapper[4883]: I0310 10:05:32.326192 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g87df_06556553-1ab9-4217-ad98-679ff31feaf9/registry-server/0.log" Mar 10 10:05:37 crc kubenswrapper[4883]: I0310 10:05:37.079935 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:05:37 crc kubenswrapper[4883]: E0310 10:05:37.080737 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:05:48 crc 
kubenswrapper[4883]: I0310 10:05:48.079860 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:05:48 crc kubenswrapper[4883]: E0310 10:05:48.080830 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.136432 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552286-2j26x"] Mar 10 10:06:00 crc kubenswrapper[4883]: E0310 10:06:00.137397 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b10c19-3e2e-4f4a-812f-bdfaa0415a7e" containerName="oc" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.137413 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b10c19-3e2e-4f4a-812f-bdfaa0415a7e" containerName="oc" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.137626 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b10c19-3e2e-4f4a-812f-bdfaa0415a7e" containerName="oc" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.138228 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.140336 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.140811 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.140992 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.147622 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552286-2j26x"] Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.298769 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rsxj\" (UniqueName: \"kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj\") pod \"auto-csr-approver-29552286-2j26x\" (UID: \"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f\") " pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.401820 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rsxj\" (UniqueName: \"kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj\") pod \"auto-csr-approver-29552286-2j26x\" (UID: \"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f\") " pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.425096 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rsxj\" (UniqueName: \"kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj\") pod \"auto-csr-approver-29552286-2j26x\" (UID: \"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f\") " 
pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.467950 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:00 crc kubenswrapper[4883]: I0310 10:06:00.892230 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552286-2j26x"] Mar 10 10:06:01 crc kubenswrapper[4883]: I0310 10:06:01.079769 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:06:01 crc kubenswrapper[4883]: E0310 10:06:01.080134 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:06:01 crc kubenswrapper[4883]: I0310 10:06:01.563271 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552286-2j26x" event={"ID":"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f","Type":"ContainerStarted","Data":"189fa60d8ccc036fb6052be568d5e610d11a3aed4e2104c741d8f46e9a41f941"} Mar 10 10:06:02 crc kubenswrapper[4883]: I0310 10:06:02.573024 4883 generic.go:334] "Generic (PLEG): container finished" podID="82c92f22-d1bc-4f9e-83b5-8b485ac02a4f" containerID="22cea35a8715f6aa0a973e315de5d9ddc6700c022d27c07bee0f3733e2e64464" exitCode=0 Mar 10 10:06:02 crc kubenswrapper[4883]: I0310 10:06:02.573101 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552286-2j26x" event={"ID":"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f","Type":"ContainerDied","Data":"22cea35a8715f6aa0a973e315de5d9ddc6700c022d27c07bee0f3733e2e64464"} 
Mar 10 10:06:02 crc kubenswrapper[4883]: E0310 10:06:02.632992 4883 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82c92f22_d1bc_4f9e_83b5_8b485ac02a4f.slice/crio-conmon-22cea35a8715f6aa0a973e315de5d9ddc6700c022d27c07bee0f3733e2e64464.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82c92f22_d1bc_4f9e_83b5_8b485ac02a4f.slice/crio-22cea35a8715f6aa0a973e315de5d9ddc6700c022d27c07bee0f3733e2e64464.scope\": RecentStats: unable to find data in memory cache]" Mar 10 10:06:03 crc kubenswrapper[4883]: I0310 10:06:03.895563 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:03 crc kubenswrapper[4883]: I0310 10:06:03.985550 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rsxj\" (UniqueName: \"kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj\") pod \"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f\" (UID: \"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f\") " Mar 10 10:06:03 crc kubenswrapper[4883]: I0310 10:06:03.991515 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj" (OuterVolumeSpecName: "kube-api-access-9rsxj") pod "82c92f22-d1bc-4f9e-83b5-8b485ac02a4f" (UID: "82c92f22-d1bc-4f9e-83b5-8b485ac02a4f"). InnerVolumeSpecName "kube-api-access-9rsxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.089559 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rsxj\" (UniqueName: \"kubernetes.io/projected/82c92f22-d1bc-4f9e-83b5-8b485ac02a4f-kube-api-access-9rsxj\") on node \"crc\" DevicePath \"\"" Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.591078 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552286-2j26x" event={"ID":"82c92f22-d1bc-4f9e-83b5-8b485ac02a4f","Type":"ContainerDied","Data":"189fa60d8ccc036fb6052be568d5e610d11a3aed4e2104c741d8f46e9a41f941"} Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.591128 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="189fa60d8ccc036fb6052be568d5e610d11a3aed4e2104c741d8f46e9a41f941" Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.591194 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552286-2j26x" Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.952651 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-8d8wl"] Mar 10 10:06:04 crc kubenswrapper[4883]: I0310 10:06:04.961581 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552280-8d8wl"] Mar 10 10:06:06 crc kubenswrapper[4883]: I0310 10:06:06.090016 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b2dc78-bc43-4cf8-a946-509772bb2522" path="/var/lib/kubelet/pods/f7b2dc78-bc43-4cf8-a946-509772bb2522/volumes" Mar 10 10:06:12 crc kubenswrapper[4883]: I0310 10:06:12.081376 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:06:12 crc kubenswrapper[4883]: E0310 10:06:12.082271 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:06:24 crc kubenswrapper[4883]: I0310 10:06:24.086188 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:06:24 crc kubenswrapper[4883]: E0310 10:06:24.087127 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:06:36 crc kubenswrapper[4883]: I0310 10:06:36.083012 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:06:36 crc kubenswrapper[4883]: E0310 10:06:36.084028 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:06:49 crc kubenswrapper[4883]: I0310 10:06:49.079926 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:06:49 crc kubenswrapper[4883]: E0310 10:06:49.081805 4883 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:07:02 crc kubenswrapper[4883]: I0310 10:07:02.180053 4883 generic.go:334] "Generic (PLEG): container finished" podID="20b00153-6497-4507-8247-81caa30a91bc" containerID="57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4" exitCode=0 Mar 10 10:07:02 crc kubenswrapper[4883]: I0310 10:07:02.180171 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8cw7n/must-gather-29w7p" event={"ID":"20b00153-6497-4507-8247-81caa30a91bc","Type":"ContainerDied","Data":"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4"} Mar 10 10:07:02 crc kubenswrapper[4883]: I0310 10:07:02.182059 4883 scope.go:117] "RemoveContainer" containerID="57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4" Mar 10 10:07:02 crc kubenswrapper[4883]: I0310 10:07:02.673884 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8cw7n_must-gather-29w7p_20b00153-6497-4507-8247-81caa30a91bc/gather/0.log" Mar 10 10:07:04 crc kubenswrapper[4883]: I0310 10:07:04.084850 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:07:04 crc kubenswrapper[4883]: E0310 10:07:04.085335 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" 
podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:07:07 crc kubenswrapper[4883]: I0310 10:07:07.402613 4883 scope.go:117] "RemoveContainer" containerID="b6eef7765d3e9379f71ea8bde62b6fb0aea46e71cd54c2a9f6788a8b9c14680a" Mar 10 10:07:12 crc kubenswrapper[4883]: I0310 10:07:12.798101 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8cw7n/must-gather-29w7p"] Mar 10 10:07:12 crc kubenswrapper[4883]: I0310 10:07:12.798909 4883 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8cw7n/must-gather-29w7p" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="copy" containerID="cri-o://212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183" gracePeriod=2 Mar 10 10:07:12 crc kubenswrapper[4883]: I0310 10:07:12.806039 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8cw7n/must-gather-29w7p"] Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.200919 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8cw7n_must-gather-29w7p_20b00153-6497-4507-8247-81caa30a91bc/copy/0.log" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.201745 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.284597 4883 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8cw7n_must-gather-29w7p_20b00153-6497-4507-8247-81caa30a91bc/copy/0.log" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.285011 4883 generic.go:334] "Generic (PLEG): container finished" podID="20b00153-6497-4507-8247-81caa30a91bc" containerID="212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183" exitCode=143 Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.285082 4883 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8cw7n/must-gather-29w7p" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.285092 4883 scope.go:117] "RemoveContainer" containerID="212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.288803 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qphvx\" (UniqueName: \"kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx\") pod \"20b00153-6497-4507-8247-81caa30a91bc\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.288913 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output\") pod \"20b00153-6497-4507-8247-81caa30a91bc\" (UID: \"20b00153-6497-4507-8247-81caa30a91bc\") " Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.295112 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx" (OuterVolumeSpecName: "kube-api-access-qphvx") pod "20b00153-6497-4507-8247-81caa30a91bc" (UID: "20b00153-6497-4507-8247-81caa30a91bc"). InnerVolumeSpecName "kube-api-access-qphvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.305405 4883 scope.go:117] "RemoveContainer" containerID="57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.392362 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qphvx\" (UniqueName: \"kubernetes.io/projected/20b00153-6497-4507-8247-81caa30a91bc-kube-api-access-qphvx\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.427354 4883 scope.go:117] "RemoveContainer" containerID="212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183" Mar 10 10:07:13 crc kubenswrapper[4883]: E0310 10:07:13.434082 4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183\": container with ID starting with 212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183 not found: ID does not exist" containerID="212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.434145 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183"} err="failed to get container status \"212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183\": rpc error: code = NotFound desc = could not find container \"212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183\": container with ID starting with 212ae0483aea6aa4e97ccfce5e5df2f442fa44a352eb100704c900ad5974c183 not found: ID does not exist" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.434175 4883 scope.go:117] "RemoveContainer" containerID="57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4" Mar 10 10:07:13 crc kubenswrapper[4883]: E0310 10:07:13.436789 
4883 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4\": container with ID starting with 57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4 not found: ID does not exist" containerID="57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.436818 4883 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4"} err="failed to get container status \"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4\": rpc error: code = NotFound desc = could not find container \"57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4\": container with ID starting with 57adb3e772bd1b0831c0c76f2621af8c245cea26b27d8eed4602c8e6c2fe21d4 not found: ID does not exist" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.463849 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "20b00153-6497-4507-8247-81caa30a91bc" (UID: "20b00153-6497-4507-8247-81caa30a91bc"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 10:07:13 crc kubenswrapper[4883]: I0310 10:07:13.495558 4883 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/20b00153-6497-4507-8247-81caa30a91bc-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 10 10:07:14 crc kubenswrapper[4883]: I0310 10:07:14.093965 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b00153-6497-4507-8247-81caa30a91bc" path="/var/lib/kubelet/pods/20b00153-6497-4507-8247-81caa30a91bc/volumes" Mar 10 10:07:19 crc kubenswrapper[4883]: I0310 10:07:19.080825 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:07:19 crc kubenswrapper[4883]: E0310 10:07:19.081440 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:07:30 crc kubenswrapper[4883]: I0310 10:07:30.080428 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:07:30 crc kubenswrapper[4883]: E0310 10:07:30.081187 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:07:44 crc kubenswrapper[4883]: I0310 10:07:44.086969 4883 
scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:07:44 crc kubenswrapper[4883]: E0310 10:07:44.087873 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:07:57 crc kubenswrapper[4883]: I0310 10:07:57.079867 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24" Mar 10 10:07:57 crc kubenswrapper[4883]: E0310 10:07:57.080608 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.144634 4883 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552288-4tlx2"] Mar 10 10:08:00 crc kubenswrapper[4883]: E0310 10:08:00.145598 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="copy" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145614 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="copy" Mar 10 10:08:00 crc kubenswrapper[4883]: E0310 10:08:00.145636 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b00153-6497-4507-8247-81caa30a91bc" 
containerName="gather" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145643 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="gather" Mar 10 10:08:00 crc kubenswrapper[4883]: E0310 10:08:00.145685 4883 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c92f22-d1bc-4f9e-83b5-8b485ac02a4f" containerName="oc" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145691 4883 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c92f22-d1bc-4f9e-83b5-8b485ac02a4f" containerName="oc" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145880 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="gather" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145901 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b00153-6497-4507-8247-81caa30a91bc" containerName="copy" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.145917 4883 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c92f22-d1bc-4f9e-83b5-8b485ac02a4f" containerName="oc" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.146720 4883 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-4tlx2" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.148670 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.148671 4883 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.149138 4883 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-9jjmk" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.156755 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552288-4tlx2"] Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.191153 4883 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzrdc\" (UniqueName: \"kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc\") pod \"auto-csr-approver-29552288-4tlx2\" (UID: \"b7688fb5-21b4-4c69-be24-7182936f25c6\") " pod="openshift-infra/auto-csr-approver-29552288-4tlx2" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.292906 4883 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzrdc\" (UniqueName: \"kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc\") pod \"auto-csr-approver-29552288-4tlx2\" (UID: \"b7688fb5-21b4-4c69-be24-7182936f25c6\") " pod="openshift-infra/auto-csr-approver-29552288-4tlx2" Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.314106 4883 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzrdc\" (UniqueName: \"kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc\") pod \"auto-csr-approver-29552288-4tlx2\" (UID: \"b7688fb5-21b4-4c69-be24-7182936f25c6\") " 
pod="openshift-infra/auto-csr-approver-29552288-4tlx2"
Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.471832 4883 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-4tlx2"
Mar 10 10:08:00 crc kubenswrapper[4883]: I0310 10:08:00.863456 4883 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552288-4tlx2"]
Mar 10 10:08:01 crc kubenswrapper[4883]: I0310 10:08:01.771648 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552288-4tlx2" event={"ID":"b7688fb5-21b4-4c69-be24-7182936f25c6","Type":"ContainerStarted","Data":"777e00b0fa3a80d69c134549338f80748efa319fd4ca35a74dc05f74fa9f8320"}
Mar 10 10:08:02 crc kubenswrapper[4883]: I0310 10:08:02.784557 4883 generic.go:334] "Generic (PLEG): container finished" podID="b7688fb5-21b4-4c69-be24-7182936f25c6" containerID="414d36a21e0b760297ba35c1fb5d7c1ceb38d2bb928b62f2d14e133f97df7d40" exitCode=0
Mar 10 10:08:02 crc kubenswrapper[4883]: I0310 10:08:02.784633 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552288-4tlx2" event={"ID":"b7688fb5-21b4-4c69-be24-7182936f25c6","Type":"ContainerDied","Data":"414d36a21e0b760297ba35c1fb5d7c1ceb38d2bb928b62f2d14e133f97df7d40"}
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.084983 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-4tlx2"
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.276698 4883 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzrdc\" (UniqueName: \"kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc\") pod \"b7688fb5-21b4-4c69-be24-7182936f25c6\" (UID: \"b7688fb5-21b4-4c69-be24-7182936f25c6\") "
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.283234 4883 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc" (OuterVolumeSpecName: "kube-api-access-zzrdc") pod "b7688fb5-21b4-4c69-be24-7182936f25c6" (UID: "b7688fb5-21b4-4c69-be24-7182936f25c6"). InnerVolumeSpecName "kube-api-access-zzrdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.379530 4883 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzrdc\" (UniqueName: \"kubernetes.io/projected/b7688fb5-21b4-4c69-be24-7182936f25c6-kube-api-access-zzrdc\") on node \"crc\" DevicePath \"\""
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.802948 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552288-4tlx2" event={"ID":"b7688fb5-21b4-4c69-be24-7182936f25c6","Type":"ContainerDied","Data":"777e00b0fa3a80d69c134549338f80748efa319fd4ca35a74dc05f74fa9f8320"}
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.802993 4883 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="777e00b0fa3a80d69c134549338f80748efa319fd4ca35a74dc05f74fa9f8320"
Mar 10 10:08:04 crc kubenswrapper[4883]: I0310 10:08:04.803049 4883 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552288-4tlx2"
Mar 10 10:08:05 crc kubenswrapper[4883]: I0310 10:08:05.139770 4883 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-pdd7d"]
Mar 10 10:08:05 crc kubenswrapper[4883]: I0310 10:08:05.146692 4883 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552282-pdd7d"]
Mar 10 10:08:06 crc kubenswrapper[4883]: I0310 10:08:06.092941 4883 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ca7076-03d6-4598-a451-cf485909b9fc" path="/var/lib/kubelet/pods/38ca7076-03d6-4598-a451-cf485909b9fc/volumes"
Mar 10 10:08:07 crc kubenswrapper[4883]: I0310 10:08:07.468101 4883 scope.go:117] "RemoveContainer" containerID="a0e857f4b7d8648de7f831deee2dafbed19a142ef98ff1e54d0826fe04524086"
Mar 10 10:08:08 crc kubenswrapper[4883]: I0310 10:08:08.080420 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:08:08 crc kubenswrapper[4883]: E0310 10:08:08.081508 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:08:22 crc kubenswrapper[4883]: I0310 10:08:22.079652 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:08:22 crc kubenswrapper[4883]: E0310 10:08:22.080705 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:08:34 crc kubenswrapper[4883]: I0310 10:08:34.087387 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:08:34 crc kubenswrapper[4883]: E0310 10:08:34.088779 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:08:47 crc kubenswrapper[4883]: I0310 10:08:47.079917 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:08:47 crc kubenswrapper[4883]: E0310 10:08:47.080876 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:08:58 crc kubenswrapper[4883]: I0310 10:08:58.080085 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:08:58 crc kubenswrapper[4883]: E0310 10:08:58.081002 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:09:08 crc kubenswrapper[4883]: I0310 10:09:08.037600 4883 scope.go:117] "RemoveContainer" containerID="3bd8ffbe632d745f21bae1bcada8aafee95f3789b72071a8dc962e253bd69366"
Mar 10 10:09:08 crc kubenswrapper[4883]: I0310 10:09:08.059623 4883 scope.go:117] "RemoveContainer" containerID="5f767fd40e7a1f328bfb19f2b7318e01d59fdbb800fcd79f270e6c2ebac7a271"
Mar 10 10:09:11 crc kubenswrapper[4883]: I0310 10:09:11.080340 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:09:11 crc kubenswrapper[4883]: E0310 10:09:11.081406 4883 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zxzn8_openshift-machine-config-operator(99873383-15b6-42ee-a65f-7917294d2e02)\"" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" podUID="99873383-15b6-42ee-a65f-7917294d2e02"
Mar 10 10:09:24 crc kubenswrapper[4883]: I0310 10:09:24.089208 4883 scope.go:117] "RemoveContainer" containerID="05d2033587521f2d4a7373fae9690003b36ba50aa52ad4334abc4ba5538e3c24"
Mar 10 10:09:24 crc kubenswrapper[4883]: I0310 10:09:24.600748 4883 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zxzn8" event={"ID":"99873383-15b6-42ee-a65f-7917294d2e02","Type":"ContainerStarted","Data":"b8cc001f4c760929811289dd62bac577b874780c74323643ab5e9dd20562531d"}